
[PC Build Thread] Nope, you still can't buy anything


• Thawmus +Jackface Registered User regular
    Namrok wrote: »
    Dr. Chaos wrote: »
    Was wondering if you guys could help me figure out what I need to improve to get sixty fps in CP out of my desktop. Running into some surprisingly low performance in Cyberpunk.

    I play at 1600x900 resolution with an RX 580 8GB, 16GB of RAM, an AMD Ryzen 5 2600 six-core processor and a good NVMe SSD, but lowering/turning off settings (been reading optimization guides) doesn't really seem to stop it from dipping into the 20-45 range often.

    Does it look like I need an upgrade to get 60 FPS consistently out of it? Just bought this GPU less than six months ago, but if I've got to buy a new one to brute-force 60 frames out of this thing on medium/high, I'll do it.

    Is it the CPU?

    I think it's both but mostly the GPU. A cursory glance through reddit shows 580s are running the game at low to medium at 1080p to get even close to 60fps, and those are with Ryzen 3xxx CPUs. Granted you're running a lower res, but I think hoping for 60fps and high settings on a 580 may be a bit much.
    I see.

    Is there a GPU under or at 300 bucks you guys would recommend that'll run it at my settings no problem?

    Should probably get something that will last me a few years of this gen at least.


    Man, that's a rough question. The one you can get honestly.

    My local Microcenter lists no RTX 2000 or 3000 series cards in stock, or GTX 1600 series, or even GTX 1000 series! In fact, it looks like the best card they have is 2019's RX 5700 XT for $400. Looks like Newegg isn't any better.

    Even used last-gen cards are still reselling at MSRP, the supply is so constrained. Every RTX 2070 Super near me is selling for $500 used. With a brand spanking new RTX 3060 Ti selling for $400 and handily beating its performance, you'd think people would price their used cards appropriately. But supply is that constrained.

    It's basically the worst time ever to be in the market for a GPU. I've never seen so few current gen, or last gen, cards on the market.

    All of this.

    Like, my price range is anywhere from $500-$1100 at this point and I can't get my mitts on a card at all.

    For $300 you're basically talking a last-gen card sometime after we work our way through this bot scalper bullshit, which most people are convinced we'll be out of in 2-3 months, which I think is super duper optimistic.

• Namrok Registered User regular
    3clipse wrote: »
    3clipse wrote: »
    3clipse wrote: »
    Thawmus wrote: »
    I mean GN released a video review yesterday of CP2077 performance across all the cards, RTX on and off, and frankly most of the review was with RTX off, so it raises some questions of just how much HUB ignores RTX performance.

    Looking at HUB's CP2077 performance review, at a glance it doesn't look like they benchmarked RTX performance at all. I'm just going off of the jumps in the video, please correct me if I'm wrong.

    This is correct, they don't do RT or DLSS often/at all because it's not "equal" with AMD

    I think it's fairly reasonable for a company that's providing them with thousands of dollars of hardware for free to ask they actually benchmark what the hardware was built for.

    Well that's extremely stupid, these are real features that exist and which people use, they can't just be ignored to achieve some kind of fake parity.

    HUB seems to think so.

    I stopped watching them a little bit ago because they started whining about their viewers wanting to at least SEE DLSS 2.0 numbers, and dude steadfastly refused. Which is nuts because it's a hardware feature!

    They don't do AMD's Smart Access Memory either!

    whyyyyyyy

    Refusal to move on from raster-only benchmarks, which are starting down the same path single-core CPU performance metrics have gone.

    It's the children who are yadda yadda

    A part of me is sympathetic to trying to stick to apples-to-apples raster comparisons. I think it's wrong. But I'm sympathetic. The feature sets of AMD and Nvidia cards are so divergent at this point, the matrix of benchmarks you need to run to get a comprehensive overview of their capabilities is really exploding in complexity. RTX on/off, DLSS on/off, Rage Mode on/off, SAM on/off. Games which are known to favor AMD, games which are known to favor Nvidia. Games which seemingly explicitly disfavor AMD more so than others (like Minecraft RTX).

    I haven't seen the hardware reviewer's job look this complicated in a long, long-ass time. Possibly not since the mid '90s, when 3D was new and seemingly every card had its own API with wildly different support.
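    To put a rough number on how fast that matrix explodes: just crossing the toggles listed above with a few resolutions and a modest pool of games already lands in the hundreds of runs per card. A back-of-the-envelope sketch, with purely illustrative counts (this is not any reviewer's actual test plan):

    # Rough combinatorics of a GPU review test matrix; all counts are illustrative.
    from itertools import product

    toggles = {
        "RT": ["on", "off"],
        "DLSS": ["on", "off"],
        "Rage Mode": ["on", "off"],
        "SAM": ["on", "off"],
    }
    resolutions = ["1080p", "1440p", "4K"]
    games = 10  # a mixed pool of AMD-leaning and Nvidia-leaning titles

    setting_combos = list(product(*toggles.values()))
    runs_per_card = len(setting_combos) * len(resolutions) * games
    print(f"{len(setting_combos)} setting combos x {len(resolutions)} resolutions "
          f"x {games} games = {runs_per_card} benchmark runs per card")
    # 16 x 3 x 10 = 480 runs per card, before repeat passes for run-to-run variance.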

• Orca Also known as Espressosaurus Wrex Registered User regular
    Namrok wrote: »
    Man, that's a rough question. The one you can get honestly.

    My local Microcenter lists no RTX 2000 or 3000 series cards in stock, or GTX 1600 series, or even GTX 1000 series! In fact, it looks like the best card they have is 2019's RX 5700 XT for $400. Looks like Newegg isn't any better.

    Even used last-gen cards are still reselling at MSRP, the supply is so constrained. Every RTX 2070 Super near me is selling for $500 used. With a brand spanking new RTX 3060 Ti selling for $400 and handily beating its performance, you'd think people would price their used cards appropriately. But supply is that constrained.

    It's basically the worst time ever to be in the market for a GPU. I've never seen so few current gen, or last gen, cards on the market.

    It's completely nuts.

    You can't get current gen for love or money. You might get lucky and get a shitty last gen (2060). Maybe. It's sold out pretty much everywhere too. Gen before last is just about out of stock everywhere until you get down to the 1660 or below. Has it literally ever been this bad for GPUs? Or even any computer components at all? I don't recall anything being this constrained ever, and my vague tracking of this sort of thing goes back at least 25 years.

• jungleroomx It's never too many graves, it's always not enough shovels Registered User regular
    edited December 2020
    Namrok wrote: »
    A part of me is sympathetic to trying to stick to apples-to-apples raster comparisons. I think it's wrong. But I'm sympathetic. The feature sets of AMD and Nvidia cards are so divergent at this point, the matrix of benchmarks you need to run to get a comprehensive overview of their capabilities is really exploding in complexity. RTX on/off, DLSS on/off, Rage Mode on/off, SAM on/off. Games which are known to favor AMD, games which are known to favor Nvidia. Games which seemingly explicitly disfavor AMD more so than others (like Minecraft RTX).

    I haven't seen the hardware reviewer's job look this complicated in a long, long-ass time. Possibly not since the mid '90s, when 3D was new and seemingly every card had its own API with wildly different support.

    Yeah but they're not doing it because it's too complex, which is 100% fair (and why even in-depth sites like GN only have a few total benchmarks)

    They're not doing it because they think it's unfair

• Dr. Chaos Post nuclear nuisance Registered User regular
    edited December 2020
    Thawmus wrote: »
    All of this.

    Like, my price range is anywhere from $500-$1100 at this point and I can't get my mitts on a card at all.

    For $300 you're basically talking a last-gen card sometime after we work our way through this bot scalper bullshit, which most people are convinced we'll be out of in 2-3 months, which I think is super duper optimistic.
    Yikes.

    Guess I'll try looking into the $400 to $500 range and pray?

• Namrok Registered User regular
    Dr. Chaos wrote: »
    Yikes.

    Guess I'll try looking into the $400 to $500 range and pray?

    If it were me, I'd be patient and do other things with my life until the RTX 3060 Ti is more available. I don't have the heart to navigate the highs and lows of attempting to get a GPU right now.

    Hopefully, being the lowest end of the new offerings and a binned RTX 3070, its supply normalizes soonest.

• jungleroomx It's never too many graves, it's always not enough shovels Registered User regular
    Dr. Chaos wrote: »
    Yikes.

    Guess I'll try looking into the $400 to $500 range and pray?

    Just an FYI, the holidays and Cyberpunk not being a flop are going to strain the shit out of the GPU supply lines.

    Getting a GPU now requires a lot of either

    A - Perseverance, refreshing sites and watching bots (a rough sketch of what that looks like is below)
    B - Patience, sitting in a queue for EVGA (like I did for almost a month and a half)
    or
    C - Money, paying a scalper for a used card
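    For what the "perseverance" option looks like in practice: the manual version is just refreshing a retailer's product page all day, and the bots are doing the same thing on a timer. Here's a minimal sketch of that idea, with everything hypothetical (the URL, the out-of-stock marker text and the polling interval are placeholders, and real retailers rate-limit or block this kind of polling):

    # Hypothetical stock-watcher sketch: poll a product page and shout when the
    # out-of-stock marker disappears. URL and marker text are placeholders only.
    import time
    import requests

    PRODUCT_URL = "https://example-retailer.test/rtx-3060-ti"  # placeholder, not a real shop
    OUT_OF_STOCK_MARKER = "Out of Stock"                       # placeholder page text
    POLL_SECONDS = 300  # be polite; hammering a real site will just get you blocked

    def looks_in_stock(page_html: str) -> bool:
        # Crude heuristic: the page no longer contains the out-of-stock marker.
        return OUT_OF_STOCK_MARKER.lower() not in page_html.lower()

    def watch() -> None:
        while True:
            try:
                resp = requests.get(PRODUCT_URL, timeout=10)
                if resp.status_code == 200 and looks_in_stock(resp.text):
                    print("Possible stock! Go check the page yourself:", PRODUCT_URL)
                    return
                print("Still out of stock, trying again later...")
            except requests.RequestException as exc:
                print("Request failed, will retry:", exc)
            time.sleep(POLL_SECONDS)

    if __name__ == "__main__":
        watch()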

• Durinia Evolved from Space Potatoes Registered User regular
    I'm about to retire an EVGA GTX 670 (and i7-3770K) if anyone could use them.

    I should go dig around my storage room to see what else I've squirreled away. I'm sure there's something ancient and fun.

• Dr. Chaos Post nuclear nuisance Registered User regular
    I guess I'll wait for the 3060 ti to come in stock.

    Holy shit PC gaming has spoiled me on 60 fps. Got me by the balls.

    It's not something I used to think would matter to me, but when you have it long enough, going back to 30 feels like moving underwater.

• 3cl1ps3 I will build a labyrinth to house the cheese Registered User regular
    Dr. Chaos wrote: »
    I guess I'll wait for the 3060 ti to come in stock.

    Holy shit PC gaming has spoiled me on 60 fps. Got me by the balls.

    It's not something I used to think would matter to me, but when you have it long enough, going back to 30 feels like moving underwater.

    This power is a blessing...and a curse.

• Namrok Registered User regular
    Dr. Chaos wrote: »
    I guess I'll wait for the 3060 ti to come in stock.

    Holy shit PC gaming has spoiled me on 60 fps. Got me by the balls.

    It's not something I used to think would matter to me, but when you have it long enough, going back to 30 feels like moving underwater.

    Locked 60 was nice. 100+ with FreeSync is truly a point of no return. Sold a buddy of mine on a 144Hz monitor by telling him 100+ fps was like HD. Just like how HD made it harder to see individual pixels, 100+ fps makes it harder to tell individual frames apart. It's a change in fidelity that's really difficult to go back from.

    And then you are forever trapped in $500+ GPUs.
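    To put numbers on the "individual frames" point: frame time is just 1000 ms divided by the frame rate, so going from 30 to 60 to 144 fps keeps cutting how long each frame sits on screen. A quick back-of-the-envelope calculation:

    # Per-frame display time at a few common frame rates (plain arithmetic).
    for fps in (30, 60, 100, 144):
        frame_time_ms = 1000.0 / fps
        print(f"{fps:>3} fps -> {frame_time_ms:5.1f} ms per frame")
    # 30 fps ~ 33.3 ms, 60 fps ~ 16.7 ms, 100 fps ~ 10.0 ms, 144 fps ~ 6.9 ms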

• Campy Registered User regular
    That moment you manage to snag a 5800x from somewhere with actual stock so it's going to arrive at the same time as the rest of your components... :biggrin:

    That moment when you realise that your RAM has been underclocked for 4.5 years because your XMP settings were apparently borked by the manufacturer... :bigfrown:

• Dixon Screwed...possibly doomed Canada Registered User regular
    edited December 2020
    I can see getting behind showing a review without RT, 'cause some people don't want that feature enabled.

    DLSS, though, doesn't have a downside: I don't notice an image quality difference when DLSS is set to Quality, and the performance gains are incredible.

    Any review not including at least that DLSS setting is just trying to fan the flames.
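    Some context on why Quality mode holds up so well: DLSS renders internally at a lower resolution and upscales to your output resolution, and the commonly cited per-axis scale factors for the DLSS 2.0 presets are roughly 0.67 (Quality), 0.58 (Balanced), 0.50 (Performance) and 0.33 (Ultra Performance). Treat those factors as approximations rather than guarantees; this little sketch just shows what they imply for a 4K output target:

    # Approximate internal render resolutions for the DLSS 2.0 presets.
    # Scale factors are commonly cited approximations, not per-game guarantees.
    DLSS_SCALES = {
        "Quality": 0.667,
        "Balanced": 0.58,
        "Performance": 0.50,
        "Ultra Performance": 0.333,
    }

    def internal_resolution(out_w, out_h, mode):
        # Resolution the GPU actually renders before DLSS upscales to the output.
        scale = DLSS_SCALES[mode]
        return round(out_w * scale), round(out_h * scale)

    for mode in DLSS_SCALES:
        w, h = internal_resolution(3840, 2160, mode)  # 4K output target
        print(f"{mode:17s}: renders ~{w}x{h}, upscaled to 3840x2160")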

• 3cl1ps3 I will build a labyrinth to house the cheese Registered User regular
    Dixon wrote: »
    I can see getting behind showing a review without RT, 'cause some people don't want that feature enabled.

    DLSS, though, doesn't have a downside: I don't notice an image quality difference when DLSS is set to Quality, and the performance gains are incredible.

    Any review not including at least that DLSS setting is just trying to fan the flames.

    Well it's also just like...people aren't going to not use these features if the card has them, you should be testing real world use cases.

• jungleroomx It's never too many graves, it's always not enough shovels Registered User regular
    Raytracing isn't going away, either. It's not like HairFX, stereoscopic 3d, or PhysX cards.

• Incindium Registered User regular
    edited December 2020
    jungleroomx wrote: »
    Yeah but they're not doing it because it's too complex, which is 100% fair (and why even in-depth sites like GN only have a few total benchmarks)

    They're not doing it because they think it's unfair

    That is actually pretty much a mischaracterization by you. HUB aren't prioritizing ray tracing/DLSS testing in initial reviews because the number of games that support them is still relatively small, and the number of people with cards that can use those features right now is also small.

    Also, DLSS quality depends on the mode used, so it isn't a cut-and-dried like-for-like comparison at a given resolution, which further complicates things.

    That said, there is a video coming tomorrow from HUB on DLSS and ray tracing performance in Cyberpunk 2077...

    Hardware Unboxed, in my opinion, is doing the best computer hardware review work among YouTubers right now. Their coverage of the 3080 series is what made me want one (and waste a bunch of time getting one), and it's really stupid of Nvidia to take them off the review sample list.

    I don't always agree with their conclusions, but they present the data objectively, in a fashion that lets me come to my own conclusions based on my own biases, budget and wants.

• Dr. Chaos Post nuclear nuisance Registered User regular
    edited December 2020
    As you guys might remember from the last few times I've popped my head in here over the years, I'm pretty component illiterate.

    One thing I've always wondered is how much does it really matter which brand of graphics card you buy?

    Like the RTX 3060 Ti, for example: there's Asus, EVGA, Gigabyte, etc.

    Does it not really matter as long as I stick to the more established companies? That's what I've done in the past.

• Dehumanized Registered User regular
    Dr. Chaos wrote: »
    As you guys might remember from the last few times I've popped my head in here over the years, I'm pretty component illiterate.

    One thing I've always wondered is how much does it really matter which brand of graphics card you buy?

    Like the RTX 3060 Ti, for example: there's Asus, EVGA, Gigabyte, etc.

    Does it not really matter as long as I stick to the more established companies? That's what I've done in the past.

    In terms of gaming framerates, it doesn't really matter. A 3060 Ti is a 3060 Ti, and they'll all perform within a few percentage points of each other.

• V1m Registered User regular
    Thawmus wrote: »
    All of this.

    Like, my price range is anywhere from $500-$1100 at this point and I can't get my mitts on a card at all.

    For $300 you're basically talking a last-gen card sometime after we work our way through this bot scalper bullshit, which most people are convinced we'll be out of in 2-3 months, which I think is super duper optimistic.

    Supply will start to meet demand in 2-3 months. It'll be June before things are back to what I laughingly call 'normal' while swilling down Xanax with gin

• danx Registered User regular
    edited December 2020
    The company matters mostly for the warranty. Some have poor RMA service, like Gigabyte (with mobos specifically, I don't know about their GPUs), while others are great. Build quality can vary a lot between manufacturers and between GPU vendors, but they're all usually pretty decent.

    There's a list of solid partners for each GPU vendor, but I'm not sure it matters much at the moment given supply and demand are so tight. It's easier to tell you which to avoid.

    I wouldn't touch KFA, XFX and maybe PNY.

• Dr. Chaos Post nuclear nuisance Registered User regular
    edited December 2020
    Ah fuck me. Forgot about bot scalpers.

    As if I wasn't getting screwed enough by those guys while looking for a PS5.

• user Registered User regular
    So hey -- right now I've got a 2080 Ti and I'm comfortable with that performance, but I've been using it with a 2700X on an X470 board -- so I think it's worth considering upgrading at least my CPU, if only because it would get me a little better performance with my current GPU and better match an upgrade to a new GPU a little down the line.

    Problem is -- I have no idea how to stay on top of stock for 5900Xs -- how do?

• V1m Registered User regular
    Honestly, you have a good, if not cutting-edge, 8c/16t CPU. You're not stuck with a 1600 or anything. It's a goddamn lot of fucking work buying a Zen 3 at the moment. Unless you have a lot of free time and a high tolerance for frustration, my advice is to just leave it for a month or two. Wait till the Christmas frenzy has died down, the supply pipelines have had a chance to flow, and shit has calmed down a bit.

• Thawmus +Jackface Registered User regular
    V1m wrote: »
    Supply will start to meet demand in 2-3 months. It'll be June before things are back to what I laughingly call 'normal' while swilling down Xanax with gin

    I think supply meeting demand doesn't matter when the suppliers are still going to be scalpers. Snatching up literally all of the supply and still selling the stuff at 200% margins so that they can maintain their stranglehold of the market is still going to be the name of the game. And they have zero cashflow problems since people keep buying from scalpers.


    Extremely willing to be proven wrong. Willing to be the dumbest dumb in all of dumbland if I can just be wrong about this one thing.

• Dr. Chaos Post nuclear nuisance Registered User regular

    Thawmus wrote: »
    I think supply meeting demand doesn't matter when the suppliers are still going to be scalpers. Snatching up literally all of the supply and still selling the stuff at 200% margins so that they can maintain their stranglehold of the market is still going to be the name of the game. And they have zero cashflow problems since people keep buying from scalpers.


    Extremely willing to be proven wrong. Willing to be the dumbest dumb in all of dumbland if I can just be wrong about this one thing.
    This is so fucked.

    I know we've just got to wait it out, but ugh.

• V1m Registered User regular
    Thawmus wrote: »
    I think supply meeting demand doesn't matter when the suppliers are still going to be scalpers. Snatching up literally all of the supply and still selling the stuff at 200% margins so that they can maintain their stranglehold of the market is still going to be the name of the game. And they have zero cashflow problems since people keep buying from scalpers.


    Extremely willing to be proven wrong. Willing to be the dumbest dumb in all of dumbland if I can just be wrong about this one thing.

    I flat out don't need or want a video card enough to pay that kind of money for a second-hand card with no warranty and a non-zero risk of being scammed, and there are plenty like me. The scalpers will run out of people willing to pay $1500 for a $700 card loooong before people who are already salty about having to pay even $650 are ready to pay $1500, or even $850, for a stranger's word on what card they're sending.

• Orca Also known as Espressosaurus Wrex Registered User regular
    I can wait out this generation if I have to. I don't want to, but I can. NBD. No way I'm spending that much dosh with that few guarantees.

• rahkeesh2000 Registered User regular
    edited December 2020
    3clipse wrote: »
    Refusal to move on from raster-only benchmarks, which are starting down the same path single-core CPU performance metrics have gone.

    It's the children who are yadda yadda

    I also see a lot of people in the comments who still think ray tracing/path tracing is a gimmick and not the eventual replacement of current lighting systems.

    This is part of why HWU does what they do. Ray tracing is still considered a useless feature by a ton of their audience. It is the future, but it's barely here right now. Steve believes that even with the best graphics cards, exactly one game is worth turning on ray tracing for at the moment. So while there's a good argument that they should still prioritize that data somewhat for those who want the benchmarks, it's also not something that is going to impact the bottom line of their reviews.

• Pixelated Pixie They/Them Registered User regular
    rahkeesh2000 wrote: »
    I also see a lot of people in the comments who still think ray tracing/path tracing is a gimmick and not the eventual replacement of current lighting systems.

    This is part of why HWU does what they do. Ray tracing is still considered a useless feature by a ton of their audience. It is the future, but it's barely here right now. Steve believes that even with the best graphics cards, exactly one game is worth turning on ray tracing for at the moment. So while there's a good argument that they should still prioritize that data somewhat for those who want the benchmarks, it's also not something that is going to impact the bottom line of their reviews.


    ... sounds like Steve is mistaken! :razz:

• V1m Registered User regular
    What are the others?

• V1m Registered User regular
    Orca wrote: »
    I can wait out this generation if I have to. I don't want to, but I can. NBD. No way I'm spending that much dosh with that few guarantees.

    Yeah RDNA 3 will be out in a year, and that won't be competing for fab space with not one but two new consoles. I'm not a teenager; I can wait if the alternative is to knowingly submit to being fleeced. Me and my self-respect will be over here, working through my Steam back catalogue.

• Thawmus +Jackface Registered User regular
    Orca wrote: »
    I can wait out this generation if I have to. I don't want to, but I can. NBD. No way I'm spending that much dosh with that few guarantees.

    I can't.

    That doesn't mean scalpers and scalper bots are going to get my money, but it also doesn't mean I'm willing to hold out for 2-3 months just to find out things are still terrible and you still have to be subscribed to 20 different alerts to get cards. All that will mean is 2-3 months of buying opportunities wasted.

• Caedwyr Registered User regular
    V1m wrote: »
    What are the others?

    I believe the joke is that Ray Tracing isn't working so hot in Metro 2077 at the moment.

• Dixon Screwed...possibly doomed Canada Registered User regular
    edited December 2020
    It shouldn't even be that complicated: if they're showing benchmarks for a game that supports those features, they should be turned on.

    If anything, just show benches for both.

    There are quite a few RT and DLSS games; a cursory Google search shows more than 15 titles for each feature, and there are many more on the docket.

    When you buy a high-end card you want it to last at least a couple of years. You should also know what you are getting with games coming out in that time frame.

    Like I was saying, RT is a big deal, but personally it takes a back seat to DLSS. A feature that gives you such a boost to performance, and in some cases a visual boost, is just bonkers.

    Reviewers like this just have their biases and want to keep their heads in the sand.

    It's not like they're still going to disable that feature once AMD catches up in that tech.

• expendable Silly Goose Registered User regular
    I spent a significant portion of this year just trying to buy one particular 1080p monitor at MSRP because as soon as they'd pop up on Amazon they'd be bought and relisted at 300% markup.

    Though it was amusing when it seemed like two 3rd party sellers were just buying the entire stock from eachother and relisting it higher. Scalper A would have the only 5 in stock at 200% MSRP, then later Scalper B would have the only 5 in at 225%, then Scalper A would have the only 5 in stock at 275%, and so on.

    I eventually got super lucky and snagged one at MSRP. I'm not even going to try for anything until after the new year, and in my heart of hearts I know that trying before March is pointless.

    Djiem wrote: »
    Lokiamis wrote: »
    So the servers suddenly decide to cramp up during the last six percent.
    Man, the Director will really go out of his way to be a dick to L4D players.
    Steam
  • Options
    jungleroomxjungleroomx It's never too many graves, it's always not enough shovels Registered User regular
    edited December 2020
    Incindium wrote: »
    Namrok wrote: »
    3clipse wrote: »
    3clipse wrote: »
    3clipse wrote: »
    Thawmus wrote: »
    I mean GN released a video review yesterday of CP2077 performance across all the cards, RTX on and off, and frankly most of the review was with RTX off, so it raises some questions of just how much HUB ignores RTX performance.

    Looking at HUB's CP2077 performance review, at a glance it doesn't look like they benchmarked RTX performance at all. I'm just going off of the jumps in the video, please correct me if I'm wrong.

    This is correct, they don't do RT or DLSS often/at all because it's not "equal" with AMD

    I think it's fairly reasonable for a company that's providing them with thousands of dollars of hardware for free to ask they actually benchmark what the hardware was built for.

    Well that's extremely stupid, these are real features that exist and which people use, they can't just be ignored to achieve some kind of fake parity.

    HUB seems to think so.

    I stopped watching them a little bit ago because they started whining about their viewers wanting to at least SEE DLSS 2.0 numbers, and dude steadfastly refused. Which is nuts because it's a hardware feature!

    They don't do AMD's Smart Access Memory either!

    whyyyyyyy

    Refusal to move on from raster-only benchmarks, which are starting down the same path single-core CPU performance metrics have gone.

    It's the children who are yadda yadda

    A part of me is sympathetic to trying to stick to apples to apples raster comparisons. I think it's wrong. But I'm sympathetic. The feature sets of AMD and Nvidia cards are so divergent at this point, the matrix of benchmarks you need to run to get a comprehensive overview of their capabilities is really exploding in complexity. RTX on/off, DLSS on/off, Rage Mode on/off, SAM on/off. Games which are known to favor AMD, games which are known to favor Nvidia. Games which seemingly explicitly disfavor AMD more so than others (like Minecraft RTX).

    I haven't seen the hardware reviewer's job look this complicated in a long, long ass time. Possibly not since the mid '90s, when 3D was new and seemingly every card had its own API with wildly different support.

    Yeah, but they're not skipping it because it's too complex, which would be 100% fair (and is why even in-depth sites like GN only have a few benchmarks total)

    They're skipping it because they think it's unfair

    That is actually pretty much a mischaracterization by you. HUB aren't prioritizing raytracing/DLSS testing in initial reviews because the number of games that support them is still relatively small, and the number of people with cards that can use those features right now is also small.

    Also, DLSS quality depends on the mode used, so it isn't a cut-and-dried like-for-like comparison at a given resolution, which further complicates things.

    That said there is a video coming tomorrow on DLSS and Raytracing performance from HUB about Cyberpunk 2077...

    Hardware Unboxed, in my opinion, is doing the best computer hardware review work among YouTubers right now. Their coverage of the 3080 series is what made me want one and waste a bunch of time trying to get one, and it's really stupid of Nvidia to take them off the review sample list.

    I don't always agree with their conclusions, but they present the data objectively, in a fashion that lets me come to my own conclusions based on my own biases, budget, and wants.

    I mean, nah.

    They've said they only want apples to apples testing of video cards, and you're saying the exact same thing they're saying:
    Also, DLSS quality depends on the mode used, so it isn't a cut-and-dried like-for-like comparison at a given resolution, which further complicates things.

    Easy fix. Do them all, have a screenshot of the worst feathering, call it a review. Let your users decide instead of deciding for them what they should care about. Or do one setting, allowing your viewers to extrapolate how that scales. Not doing it because it's complicated is kind of silly.

    And it's great they have DLSS and RT in a CP video, but they should have them in all of their GPU reviews because it's tech that's going to be around from now on, and it's a part of why people are spending their money.

    They're flagship features, they're in demand as has been pretty readily proven, and if I'm going to watch a tech reviewer it's not gonna be some dude with his head up his ass about new tech, instead opting to play Mr. Gatekeeper McPurityTest.

    And refusing to test my hardware's flagship features because of arbitrary standards would get you yanked off my freebie list as well.

    jungleroomx on
  • Options
    jungleroomxjungleroomx It's never too many graves, it's always not enough shovels Registered User regular
    edited December 2020
    Like, it's fine if you want to tell a tech company to go fuck itself with regard to reviewing the biggest unit-moving features of its product. As a review channel, that's absolutely your right, and there's nothing wrong with it in principle.

    But don't expect free shit if you do so.

    jungleroomx on
  • Options
    htmhtm Registered User regular
    3clipse wrote: »
    3clipse wrote: »
    3clipse wrote: »
    Thawmus wrote: »
    I mean GN released a video review yesterday of CP2077 performance across all the cards, RTX on and off, and frankly most of the review was with RTX off, so it raises some questions of just how much HUB ignores RTX performance.

    Looking at HUB's CP2077 performance review, at a glance it doesn't look like they benchmarked RTX performance at all. I'm just going off of the jumps in the video, please correct me if I'm wrong.

    This is correct, they don't do RT or DLSS often/at all because it's not "equal" with AMD

    I think it's fairly reasonable for a company that's providing them with thousands of dollars of hardware for free to ask they actually benchmark what the hardware was built for.

    Well that's extremely stupid, these are real features that exist and which people use, they can't just be ignored to achieve some kind of fake parity.

    HUB seems to think so.

    I stopped watching them a little bit ago because they started whining about their viewers wanting to at least SEE DLSS 2.0 numbers, and dude steadfastly refused. Which is nuts because it's a hardware feature!

    They don't do AMD's Smart Access Memory either!

    whyyyyyyy

    Refusal to move on from raster-only benchmarks, which are starting down the same path single-core CPU performance metrics have gone.

    It's the children who are yadda yadda

    I also see a lot of people in the comments who still think raytracing/path tracing is a gimmick and not the eventual replacement of current lighting systems.

    This is part of why HWU does what they do. Raytracing is still considered a useless feature by a ton of their audience. It is the future, but it's barely here right now. Steve believes that even with the best graphics cards, exactly one game is worth turning on raytracing for at the moment. So while there's a good argument that they should still prioritize that data somewhat for those who want the benchmarks, it's also not something that is going to impact the bottom line of their reviews.

    Do they benchmark raster @ 4K? 4K is, by that logic, also a useless feature. After all, the vast majority of their audience doesn't have a 4K display. And I bet by next summer, a very significant percentage of PC gamers will have at least one game in their libraries that supports ray tracing. Far, far more than will be gaming on 4K displays.

    Also, the future is the most important criterion for buying a GPU, especially this year, when the consoles are gaining ray-tracing and many new AAA titles are therefore going to support it.

    Finally, why can't they just run both raster and ray-tracing benchmarks? Most other hardware review sites manage to do both. I'm not saying Nvidia is right to blacklist HUB, but HUB's philosophy of GPU reviews is weirdly idiosyncratic.

  • Options
    Dr. ChaosDr. Chaos Post nuclear nuisance Registered User regular
    I was long overdue for a PSU upgrade, so I treated myself to a Thermaltake Toughpower GF1 750W on Amazon. Reviews seemed decent enough.

    I've had a 500W unit for as long as I can remember, and a 650W was what I was initially looking to upgrade to, but eh, fuck it. Never know, right?
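
    A rough back-of-the-napkin sizing sketch, if anyone wants to sanity-check their own build; every wattage below is a placeholder guess, not a measured number, so swap in your actual parts' specs:

    # Crude PSU sizing estimate -- all values are placeholder assumptions.
    gpu_tdp  = 320   # e.g. a 3080-class card's board power
    cpu_tdp  = 105   # e.g. a Ryzen 7 under sustained load
    rest     = 75    # motherboard, RAM, drives, fans, USB devices
    headroom = 1.25  # margin for transient spikes and capacitor aging

    estimated_draw = (gpu_tdp + cpu_tdp + rest) * headroom
    print(estimated_draw)   # 625.0 -> a 650 W unit is cutting it close, 750 W is comfortable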

    Pokemon GO: 7113 6338 6875/ FF14: Buckle Landrunner /Steam Profile
  • Options
    IncindiumIncindium Registered User regular
    edited December 2020
    Incindium wrote: »
    Namrok wrote: »
    3clipse wrote: »
    3clipse wrote: »
    3clipse wrote: »
    Thawmus wrote: »
    I mean GN released a video review yesterday of CP2077 performance across all the cards, RTX on and off, and frankly most of the review was with RTX off, so it raises some questions of just how much HUB ignores RTX performance.

    Looking at HUB's CP2077 performance review, at a glance it doesn't look like they benchmarked RTX performance at all. I'm just going off of the jumps in the video, please correct me if I'm wrong.

    This is correct, they don't do RT or DLSS often/at all because it's not "equal" with AMD

    I think it's fairly reasonable for a company that's providing them with thousands of dollars of hardware for free to ask they actually benchmark what the hardware was built for.

    Well that's extremely stupid, these are real features that exist and which people use, they can't just be ignored to achieve some kind of fake parity.

    HUB seems to think so.

    I stopped watching them a little bit ago because they started whining about their viewers wanting to at least SEE DLSS 2.0 numbers, and dude steadfastly refused. Which is nuts because it's a hardware feature!

    They don't do AMD's Smart Access Memory either!

    whyyyyyyy

    Refusal to move on from raster-only benchmarks, which are starting down the same path single-core CPU performance metrics have gone.

    It's the children who are yadda yadda

    A part of me is sympathetic to trying to stick to apples to apples raster comparisons. I think it's wrong. But I'm sympathetic. The feature sets of AMD and Nvidia cards are so divergent at this point, the matrix of benchmarks you need to run to get a comprehensive overview of their capabilities is really exploding in complexity. RTX on/off, DLSS on/off, Rage Mode on/off, SAM on/off. Games which are known to favor AMD, games which are known to favor Nvidia. Games which seemingly explicitly disfavor AMD more so than others (like Minecraft RTX).

    I haven't seen the hardware reviewer's job look this complicated in a long, long ass time. Possibly not since the mid '90s, when 3D was new and seemingly every card had its own API with wildly different support.

    Yeah, but they're not skipping it because it's too complex, which would be 100% fair (and is why even in-depth sites like GN only have a few benchmarks total)

    They're skipping it because they think it's unfair

    That is actually pretty much a mischaracterization by you. HUB aren't prioritizing raytracing/DLSS testing in initial reviews because the number of games that support them is still relatively small, and the number of people with cards that can use those features right now is also small.

    Also, DLSS quality depends on the mode used, so it isn't a cut-and-dried like-for-like comparison at a given resolution, which further complicates things.

    That said there is a video coming tomorrow on DLSS and Raytracing performance from HUB about Cyberpunk 2077...

    Hardware Unboxed, in my opinion, is doing the best computer hardware review work among YouTubers right now. Their coverage of the 3080 series is what made me want one and waste a bunch of time trying to get one, and it's really stupid of Nvidia to take them off the review sample list.

    I don't always agree with their conclusions, but they present the data objectively, in a fashion that lets me come to my own conclusions based on my own biases, budget, and wants.

    I mean, nah.

    They've said they only want apples to apples testing of video cards, and you're saying the exact same thing they're saying:
    Also, DLSS quality depends on the mode used, so it isn't a cut-and-dried like-for-like comparison at a given resolution, which further complicates things.

    Easy fix. Do them all, have a screenshot of the worst feathering, call it a review. Let your users decide instead of deciding for them what they should care about. Or do one setting, allowing your viewers to extrapolate how that scales. Not doing it because it's complicated is kind of silly.

    And it's great they have DLSS and RT in a CP video, but they should have them in all of their GPU reviews because it's tech that's going to be around from now on, and it's a part of why people are spending their money.

    They're flagship features, they're in demand as has been pretty readily proven, and if I'm going to watch a tech reviewer it's not gonna be some dude with his head up his ass about new tech, instead opting to play Mr. Gatekeeper McPurityTest.

    And refusing to test my hardware's flagship features because of arbitrary standards would get you yanked off my freebie list as well.

    Lol Easy fix... increase your workload by 6x for minimal gain. So simple.

    Edit: 4 modes of DLSS with and without RT would be 8x actually.
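
    A quick count bears that out; a tiny sketch, assuming the four standard DLSS modes and treating the plain raster pass as the baseline:

    from itertools import product

    # Assumed settings: the four standard DLSS modes, crossed with RT on/off.
    dlss_modes = ["Quality", "Balanced", "Performance", "Ultra Performance"]
    rt_states  = ["RT off", "RT on"]

    extra_configs = list(product(dlss_modes, rt_states))
    print(len(extra_configs))   # 4 * 2 = 8 extra passes per game/card, on top of the raster baseline

    # And that's before multiplying by resolutions, the game selection, and the card stack,
    # which is where the benchmark matrix really explodes.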

    Incindium on
    Nintendo ID: Incindium
    PSN: IncindiumX
This discussion has been closed.