Haven't heard anything concrete on the embargo date, but speculation is 2-3 weeks before launch, based on the 2xxx series. So mid-June at the earliest, which puts it around E3 time.
If all else is equal, would it make sense to go with the cheaper option? I'm seeing used 1080 Tis for around $500; haven't found a 2080 yet.
Especially if you're not interested in Ray Tracing--at the moment--probably. I'm "still" using a GTX 1080ti for a 2160p setup. If anything, my aging i5 is more a bottleneck for me.
Fallout affects all processor generations we have tested. However, we notice a worrying regression, where the newer Coffee Lake R processors are more vulnerable to Fallout than older generations.
Fucking christ, Intel, what are you doing?
I would rather be accused of intransigence than tolerating genocide for the sake of everyone getting along. - @Metzger Meister
NEO|Phyte
They found ANOTHER vulnerability? Jesus.
It was that somehow, from within the derelict-horror, they had learned a way to see inside an ugly, broken thing... And take away its pain.
Warframe/Steam: NFyt
Which Ryzen would be good if I want to record/stream Total War games with a 1080ti, and a possible GPU upgrade in the future?
That's going to depend a lot on how much you're willing to spend. You'll probably want at least a 3600X. I play a lot of Total War, and I've got my eye on either the 3800X or whatever they end up calling that 16-core monster (3990X?) we'll be seeing at E3.
Honestly, with the 1080 Ti's dual NVENC chips, it can do the heavy lifting for the video encoding. The processor will mostly be handling the game itself, so choose accordingly.
This looks even worse for Intel when you put it out there:
I'm leaning towards the absurd: the ASUS ROG Crosshair VIII Formula. Will have to see what the price is. It also says it has a built-in liquid cooling block on some of the motherboard components, so maybe going down to the Hero is better (since I don't do liquid cooling).
I wish I knew what RAM specs to aim for, the sites all just say "DDR4."
Donovan Puppyfucker
These TDP ratings HAVE to be complete bullshit. How in the wide wide world of sports can the Ryzen 9 3900X and the Ryzen 7 3800X have the same TDP? The 9 is 50% more chip!
The -800 would be made with trashier silicon, possibly.
Okay but no, the wafer still has to work, those 8 cores still have to perform to spec, and they're on the same architecture and lithography.
So a 12-core chip specced to the same per-core performance, coming out of the same plant, on the same architecture, and built on the same process, should need very nearly (if not exactly) the same power draw per core, and the same voltage regulation and so on per core.
This is like expecting a 6 litre V12 engine with 50% more power to have the exact same gas mileage as a 4 litre V8 in otherwise identical cars.
That's not how physics works! There are laws of thermodynamics which cannot be circumvented!
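For what it's worth, the arithmetic behind the objection is easy to sketch (this assumes the rumored 105 W TDP figure applies to both parts; purely illustrative):

```python
# Toy per-core power comparison. If the 3800X and 3900X really share
# the same TDP, the 12-core part must average less power per core.
TDP_W = 105  # assumed rating for both parts (rumored spec, not confirmed)

per_core_8 = TDP_W / 8    # 13.125 W per core
per_core_12 = TDP_W / 12  # 8.75 W per core

print(f"8-core:  {per_core_8:.3f} W/core")
print(f"12-core: {per_core_12:.3f} W/core")
print(f"per-core budget cut: {1 - per_core_12 / per_core_8:.0%}")  # 33%
```

That ~33% per-core gap is exactly what the post is calling out.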
Intel isn't going anywhere but they're poised to lose significant ground in the desktop space which is great.
We'll see how epyc does in the datacentre over the next few years.
They've lost a serious amount of performance because of all those speculative execution hacks, it's been something like 8 in one year and AMD only got hit with what, 2 of them?
not a doctor, not a lawyer, examples I use may not be fully researched so don't take out of context plz, don't @ me
Intel is in all Macs and 95%+ of laptops, and that market dwarfs desktop PC sales.
Equifax gave away tens of millions of people's private financial data and doesn't appear any the worse for wear...
I'm not saying intel is winning right now but "intel is dead!" is so far beyond hyperbole it's into absurdity.
It's called 'binning'.
Some chips need different amounts of power to hit certain clocks, or won't hit them at all. Comparing the production of V8s and CPUs is very much an apples and oranges thing - wafer manufacture is, by nature, a far less error-tolerant process than engine manufacture, with widely varying results in quality. That quality often determines how much current you have to push through a die to reach a given frequency, or how stable it is at a given overclock.

The more efficient chiplets are likely being sucked up by the higher-tier or more power-sensitive products, or by other stacks like Threadripper or Epyc. The 3800X may be getting lower-binned chiplets that still have all 8 cores functional, or chiplets that can hit really good frequencies but suck power while doing it.

This is, incidentally, why things like Vega draw so much power - a lot of those chips can be undervolted down to nVidia levels and actually run faster thanks to less thermal throttling, but not all of them could, so they all got set to the higher voltage to make sure every card delivered. Same goes for a lot of nVidia chips.
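A toy model of how binning can make that work, with a made-up quality distribution and arbitrary tier cutoffs (nothing here reflects AMD's actual yields):

```python
import random

random.seed(0)

# Pretend each chiplet's "power score" is its relative draw at some target
# frequency; wafer variation means scores are spread around 1.0.
chiplets = sorted(random.gauss(1.0, 0.1) for _ in range(100))

# Hypothetical allocation: the most efficient dies get pulled for the
# power-sensitive stacks (Epyc/TR), the mid tranche goes to the 12-core,
# and the 8-core gets whatever full-core-count dies are left.
epyc_bin = chiplets[:30]
r9_bin = chiplets[30:60]
r7_bin = chiplets[60:]

avg = lambda xs: sum(xs) / len(xs)
# The leftover bin still works fine - all cores functional - it just
# needs more power on average to hit the same clocks.
assert avg(r7_bin) > avg(r9_bin) > avg(epyc_bin)
```

Same rated TDP on the box, very different silicon underneath.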
I wouldn't be surprised if Apple went pure AMD in the coming generations, at least on desktop: they already do business with AMD for their GPUs, so they might get a cheaper package deal, and then there are Intel's security issues.
I didn't say they were.
I did make a joke earlier about how crazy their stuff is, which you're probably reading too far into.
I've waited this long, so I think I'm going to try to extend this build another year because I think DDR5 is *right there* and my builds tend to go ~5 years before replacement (CPU+Mobo+RAM).
That being said, I'm going to use the rest of this year to get some gear that will transfer over: new case and probably a new CPU cooler
It seems weird that DDR5 is still not there in PCs. A console using GDDR5 as system RAM came out in 2013.
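Worth noting GDDR5 is graphics memory, a separate lineage from desktop DDR despite the name - consoles picked it for raw bandwidth, not because they were a generation ahead. A rough back-of-envelope with commonly cited figures (treat as approximate):

```python
# Peak bandwidth in GB/s = transfer rate (transfers/s) * bus width (bytes).
# PS4 (2013): GDDR5 at ~5500 MT/s on a 256-bit (32-byte) bus.
ps4_gddr5_gbs = 5500e6 * 32 / 1e9        # 176.0 GB/s

# Typical 2019 desktop: DDR4-3200, two 64-bit (8-byte) channels.
desktop_ddr4_gbs = 3200e6 * 8 * 2 / 1e9  # 51.2 GB/s

print(ps4_gddr5_gbs, desktop_ddr4_gbs)
```

Graphics memory trades away things desktops care about (capacity per channel, upgradability - it's soldered) for that bandwidth, which is part of why PC platforms stuck with DDR.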
If the new Ryzens can be less finicky about memory at launch than their predecessors were, that's going to be hard to beat.
The Ryzen memory thing was fixed with Zen+.
It's all gravy now.
I'm thinking cyber Monday might be a good time to upgrade my system this year.
Origin ID: Discgolfer27
Untappd ID: Discgolfer1981
Official certified benchmarks? No. However, benchmarks have been leaking out left and right from relatively reliable sources.
https://wccftech.com/amd-ryzen-5-3600-zen-2-cpu-benchmark-leak-crushes-coffee-lake-price-performance/
https://hothardware.com/news/amd-ryzen-3000-16-core-zen-2-beast-cinebench-r15
As soon as the first one was found, people wondered if there were more.
People were pretty sure there were a lot more they just weren't sure exactly where.
It was found last year; they only disclosed it recently.
This is why I'm not giving Intel the time of day until they shitcan Coffee Lake. This is Bulldozer in slow motion, a crashing failure by a thousand cuts.
And the cumulative losses from the mitigations have wiped out something like 16% of that thing's gaming performance, more if you're a prosumer.
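Those losses stack multiplicatively rather than adding, which is how a pile of "few percent each" fixes lands near 16%. A sketch with assumed, evenly sized per-mitigation costs (the real hits vary a lot by workload):

```python
# Assume ~8 mitigations each costing ~2% - illustrative numbers only.
per_fix_cost = 0.02
n_fixes = 8

remaining = (1 - per_fix_cost) ** n_fixes  # fraction of performance left
total_loss = 1 - remaining

print(f"{total_loss:.1%}")  # 14.9%, in the ballpark of the quoted 16%
```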
https://www.anandtech.com/show/14407/amd-ryzen-3000-announced-five-cpus-12-cores-for-499-up-to-46-ghz-pcie-40-coming-77
Nah, do it.
Intel's had it coming for years with their constant lazy bullshit.
I didn't say they were.
I did make a joke earlier about how crazy their stuff is, which you're probably reading too far into.
I was
I suspect they may be like pre-Zen AMD for a while - decent GPUs for the budget market, kind of lackluster CPUs.
Intel doesn't make GPUs except for their integrated stuff right?
They're supposed to have discrete GPUs on the market by 2020.