I'm an avid (yet terrible) Rainbow Six Siege player and I've noticed that they've released Vulkan API functionality for it. Doing some research, Vulkan should take some strain off my GPU and put it on the CPU (i think?). Is my pc Vulkan compatible?
And are there any benchmark tools you guys recommend?
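A quick way to answer the compatibility question from the OS side is the `vulkaninfo` tool that ships with the LunarG Vulkan SDK, which dumps what your driver actually exposes. If you just want a yes/no probe, here's a rough sketch (the loader library names are assumptions for Windows/Linux, and finding the loader doesn't guarantee your GPU driver exposes a usable Vulkan device):

```python
import ctypes
import ctypes.util

def vulkan_loader_present() -> bool:
    """Best-effort check: can we locate and load a Vulkan loader library?"""
    # Common loader names: vulkan-1.dll on Windows, libvulkan.so.1 on Linux.
    for name in ("vulkan-1", "vulkan", "libvulkan.so.1"):
        candidate = ctypes.util.find_library(name) or (
            name if name.startswith("lib") else None
        )
        if candidate:
            try:
                ctypes.CDLL(candidate)
                return True
            except OSError:
                continue  # found a name but couldn't load it; keep trying
    return False

print("Vulkan loader found:", vulkan_loader_present())
```

If the loader is present, running `vulkaninfo` and checking that it lists your GPU is the cleaner confirmation; if that errors out, your driver doesn't support Vulkan.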
Considering what you already have, I would wait for the next CPU generation to upgrade. You already have 6 cores. GPU could obviously be upgraded at any point. How's your SSD situation?
I've got a 250gig SSD that's chugging along. It's fine so far as the only modernish game I play is siege and worst case I can just uninstall that if there's a space issue.
My question was more "if I run the vulkan API option on games, will my pc cope". Which it seems it will?
I'm playing at 1080p mind
I wouldn't bother upgrading anything unless you're not getting the performance you want out of your rig.
For 1080p I think you're fine tbh, and will be for a couple of years yet.
GnomeTank ("What the what?") - Portland, Oregon - Registered User
edited March 2020
Why would you avoid Nvidia's next cards specifically because they're on a 10nm process node? Nvidia has a history of making very good high-transistor-count GPUs on "older" process nodes. Buy a card based on its performance and features, not a marketing number that has effectively lost all meaning. Once FinFET was introduced, transistor density started going up without the gate size actually getting smaller, turning the node number into a pure marketing exercise that only very loosely translates to real-world power and efficiency. A good architecture on a "10nm process" is still going to beat a bad or mediocre one on a "7nm process" (Navi is a great architecture so far, but being on 7nm doesn't automatically make it the winner).
She's gonna keep using the gtx 1080, and then the same case and PSU for now.
Should be a decent upgrade from the FX8150 she is using currently.
I'm a little sad to hear Nvidia's new offering seems to be 10nm. Might have to switch over to team red for my next GPU.
The new Xbox and PlayStation give us a preview of what Navi 2 is going to be like, and so far that looks pretty damn good. If AMD can't bring out something decently competitive this generation, they'll be underperforming expectations.
We don't have nearly as much of an indication of how much Nvidia are going to improve performance, but every indication is that Navi 2 will be a lot better than Navi 1.
Was just going to ask, is the nearly double cost of 570 vs 470 worth it? ASUS ROG Strix X470-F is $170 and the X570-E is $299.
Not unless you very specifically need PCIe 4.0. The only other consideration is making sure that particular X470 board ships with a BIOS capable of running Ryzen 3000 chips (assuming that's what you're putting on it), so you don't need an older Ryzen chip just to get it to boot for a BIOS update.
There's been a lot of rumor-mill whispering that RTX 3000 will have a 50% boost watt for watt.
I mean there's a reason Nvidia has been super silent while AMD's been out there talking up Navi 2. Either Nvidia thinks they are in trouble, or they are supremely confident. We won't really know until cards hit the wild. It's a win win for those of us in the ultra-premium graphics card market either way.
I hope AMD doesn't ratchet up the price to match Nvidia, so we can try to force that shit down a bit.
AMD really needs to go like for like this card gen, like they did in the CPU space. Remember, Intel had to cut their prices in half to compete (which tells you what kind of margins they were getting).
If we're talking purely in the space of rumors, the rumored reason Nvidia is sticking to a 10nm process is specifically to bring costs down and yields up so that prices can fall back a bit, on the assumption that competition from AMD will be real this time. 7nm EUV fab space is really hard to come by right now, with only TSMC having it in full production at scale. Samsung is still not quite there, and who really knows where Intel is, since they don't sell their fab space to third parties.
While Nvidia certainly gouged the prices of the Turing cards because they could, having no serious competition, Turing GPUs were incredibly dense, and the yields, especially of the TU-102 in the 2080 Ti, were not stellar.
Shadowfire - Vermont, in the middle of nowhere - Registered User
This time off from work has been bad. I've been playing with PC part picker and figuring out if we can put together a couple new computers. Right now I have an i5-3570k and a 660Ti which lol anything could be an upgrade. The wife is on a fifth gen i7 with a 960 I think.
It's not just the 10nm that would have me swapping over; the rumours have definitely changed their tune as of late. At first we were hearing a 60-80% increase, and now that I'm hearing they've fallen back to 10nm, the expected gains for next gen are 40%.
Of course it's still all speculation. Just interesting both consoles are using AMD, and nvidia moved back their announcement.
I'd like to stick with Nvidia just so I can continue using Moonlight. But AMD is definitely stepping up their game, and I'd jump over for that price/performance difference.
Regarding the motherboard, it's just less hassle going with X570, and I've been using it with no issues. Money isn't really a problem for her; I had sent her B450 and X470 boards, but she wanted to go with that.
Both consoles are using AMD for the same reason both consoles used AMD last gen: Only AMD can offer a complete SoC with an x86 CPU and competitive GPU. The fact that AMD seems to be able to compete on the GPU front this time is a nice bonus.
I for one am basically stuck using Nvidia cards because my primary monitor is a G-Sync that's too old to be compatible with the new standard that allows some G-Sync monitors to work with AMD cards. That's not to say I couldn't just buy a new monitor but that's an added expense I have zero interest in factoring into a new computer build.
Next one, definitely, will be a 144Hz HDR FreeSync, I think.
I can't recommend that enough.
Well I'm looking for a monitor I can use for a few years before it's horribly outdated.
HDR, FreeSync, and 120+Hz are all must-haves, though I am waffling between 4K and 1440p, especially since if I need the frames I can kick the resolution down to 1080p on the 4K and not get the blurries.
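The "no blurries" point is just integer scaling arithmetic: 3840x2160 down to 1920x1080 is an exact 2x pixel mapping, while 2560x1440 down to 1920x1080 is a fractional ~1.33x ratio that has to be interpolated. A throwaway check (function name is mine):

```python
def scales_cleanly(native, target):
    """True if the target resolution maps onto the native panel at a
    whole-number pixel ratio (no interpolation needed)."""
    nw, nh = native
    tw, th = target
    return nw % tw == 0 and nh % th == 0 and nw // tw == nh // th

# 4K panel running 1080p: every logical pixel becomes a crisp 2x2 block.
print(scales_cleanly((3840, 2160), (1920, 1080)))  # True
# 1440p panel running 1080p: 1.33x ratio, so pixels get smeared.
print(scales_cleanly((2560, 1440), (1920, 1080)))  # False
```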
I just upgraded my TV specifically because I wanted one with HDR, VRR and 120hz. It was a good decision.
So I continue fucking around with parts. Playing with this right now.
Ryzen 5 3600
ASRock B450M Pro 4
16GB DDR4-3200
500GB M.2
EVGA GTX1660 Super
EVGA 500W Bronze+ PSU
Unknown case! Who knows!
I'm looking around 750 right now but would love to bring it down some. I have a 1080 monitor and don't plan on upgrading it right now, but would like to leave room for a 1440 a year or so away.
50% watt for watt?
Did Nvidia invent an entirely new method of computing?
Did they discover a new material to replace silicon in ICs that makes everything run at double speed?
A 50% computational increase watt for watt is the kind of pie-in-the-sky generational jump that would put Nvidia ahead for multiple generations; believing in that is like believing in the tooth fairy.
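For what the claim actually means: "watt for watt" is performance per watt, not raw performance, so a 50% gain could come from more speed, less power, or both. With made-up illustrative numbers:

```python
# Hypothetical cards to illustrate a "50% boost watt for watt":
# same 250 W board power, 1.5x the frame rate.
old_fps, old_watts = 100.0, 250.0
new_fps, new_watts = 150.0, 250.0

old_eff = old_fps / old_watts  # 0.4 fps per watt
new_eff = new_fps / new_watts  # 0.6 fps per watt
gain_pct = (new_eff / old_eff - 1.0) * 100.0

print(f"perf-per-watt gain: {gain_pct:.0f}%")  # prints: perf-per-watt gain: 50%
```

Equivalently, the hypothetical new card could hold the old frame rate at roughly 167 W; either way, the ratio is what the marketing number refers to.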
jungleroomx ("It's never too many graves, it's always not enough shovels") - Registered User
I mean
AMD kinda just did it with the Ryzen 9 4900HS vs the i9-9880H.
And of course if we're talking raytracing, the compute units in the 2000 series are probably incredibly primitive and wasteful.
I'm fairly certain that an increase of 50% over the previous generation with the same TDP isn't exactly some mindblowing thing, especially with new tech being onboarded. So uh... yeah
That's odd because it's exactly the figure AMD are claiming for Navi 2.
Ofc they're starting from a lower baseline.
I mean
Unless the implication is that we've hit the very peak of physics on what 1 watt of energy can get a processor to output, and it can't ever get more efficient.
I'm willing to bet that's not true, however
Yeah, 50% inside an existing TDP envelope isn't a unicorn or mythical creature. It's impressive but not physics breaking.
That horizontal line near the bottom of the screen. It coincides roughly with where the taskbar on my computer ends (I moved the taskbar to take the photo).
That's a solid bang-per-buck spec. I think anything you cut there, you're going to be slipping down a slope where you lose a lot of bang to save not many bucks. Probably your only room for flex is the video card. Maybe you could pick up a cheap used RX 580 to tide you over?
It shouldn't have burn-in, no, but that definitely looks like it. If it doesn't change with scene changes (games, etc.), then that's burn-in.
In theory burn in can happen on basically any panel type. Some panel types are just more or less resistant to it.
Could it be retention? Run one of those weird pixel refreshers for a while and see if it helps?
It's worth a shot. I fixed a "broken" plasma by doing that.
Those pixels look pretty blown out to me though
I did some research and it looks like it's image persistence due to the taskbar being displayed all the time. I loaded up about:blank and fullscreened for a few hours and that seemed to help; I don't see the bar any more.
I think I'll keep my taskbar hidden in the future just to be safe.
Lower is better, right?
3700x
x570 TUF mobo
2x16 @ 3600MHz
1 TB Samsung NVME
If AMD can continue their trajectory they've started with the 5700XT and come up with a competitive raytracing system, yeah
I used to use a BT USB dongle, but having more than one BT device connected really messes them up.
Unless you plan on getting PCIE Gen 4 hardware any time soon, probably not.
Time off to think about things is bad!
https://imgur.com/vKFxAZc
That horizontal line near the bottom of the screen. It coincides roughly with where the taskbar on my computer ends (I moved the taskbar to take the photo).