I'm curious to hear your opinion on building in the 275R.
I was thinking of getting that case at one point.
I'm most of the way through the build and have some thoughts. First, I'd give the case a 7/10. I wish I'd spent more money but it's a good value.
Not that this case isn't nice. I've also built in much more expensive cases that were way, WAY worse. It's got cable cutouts everywhere you need 'em, including some places you might not expect but that come in handy. There are plenty of loops for zip ties and cable management. The layout is generous where it counts. I can just barely do 4x fans in push/pull with my H100 in the top of the case for exhaust. It looks like I'll be able to do push/pull with the GPU rad too. It looks like I'll be able to route all my cables out of sight behind the motherboard tray; I've already got most of them back there. There are plenty of little dark corners to tuck cables into. It of course has that legendary Corsair attention to detail: no sharp edges, all the cutouts are rolled (to keep them from cutting into your cables), the paint is consistent, there are tons of extra screws, and everything fits together well.
Now for the cons. I've built in better cases; my last case was better. If I had any bigger heatsinks on the RAM or mobo, they would have gotten in the way of the bottom fan, making push/pull harder or impossible with the H100, at least without those super-slim fans. I'd still be able to do one or the other with tons of room to spare, though. I rather unfortunately learned that you have to mount the water block to the CPU before mounting the rad to the case. Despite the generous area for components, it is TIGHT everywhere else. I struggled more than a little to get all the cables poked through the right holes after I had mounted the fans and rad. Route all your cables through the cutouts before mounting that stuff and it'll be a lot easier. Very few of the cutouts have rubber grommets on them. A more expensive model would have had grommets on nearly every cutout, as well as more room for cable routing after components are mounted.
I'm getting kind of tired so I think I'm going to have to finish the build tomorrow. I'll probably take off work early tomorrow around lunchtime to come home and finish it up.
I wonder how far the monitor makers will push resolution, will it actually be beyond what 99% of the population can resolve with their eyes? Will there be an arms race for bionic eyes and higher resolution displays? The mind boggles.
I just want an ultra widescreen 1440p display. That would make me happy.
Look to phones.
...
yes, they will. If consumers will swallow their line of bullshit.
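As a back-of-the-envelope check on the "what can the eye resolve" question, the commonly cited figure for 20/20 vision is about 1 arcminute of detail. Under that (admittedly simplified) assumption, the pixel density beyond which individual pixels blur together depends only on viewing distance:

```python
import math

def acuity_ppi(distance_in):
    """PPI beyond which a 20/20 eye (~1 arcminute per pixel, a
    simplification) can no longer resolve individual pixels at
    the given viewing distance in inches."""
    arcmin = math.radians(1 / 60)  # 1 arcminute in radians
    return 1 / (distance_in * math.tan(arcmin))

for d in (12, 24, 36):
    print(f'{d}" viewing distance: ~{acuity_ppi(d):.0f} ppi')
```

At a typical ~24" desktop distance this works out to roughly 140 ppi, and a 27" 4K panel is already around 163 ppi, so for most desk setups diminishing returns kick in well before 8K.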
There's an actual physical limitation right now with screens and pixels, and pushing past it is going to cost a lot more than improvements like HDR.
Yes, companies are trying to push shit like 8K, but it's going to be extremely expensive for quite some time.
edited August 2019
Mild overclock bringing my 3600 up a bit, mem overclock done, XMP profiles loaded, AIDA64 stability test done capping out at a chilly 72C.
I only have a 6ft DVI cable for my second monitor, so I grabbed a DP->DVI cable that was 10ft long, but it won't go to 1440p on my monitor. I'd like to move away from DVI-DVI, so is there a certain brand or spec I should check when I look for a replacement DP-DVI cable? Just check for supported resolutions; or is there a limit to the spec?
For the record, my main monitor is DP-DP and does 1440p just fine.
I personally don't notice a difference between 100 and 144 but it's different person to person.
A lot of people don't. Shockingly, there is no solid scientifically-tested consensus on exactly what every person, at every possible distance (and within a pretty limited number of viewing angles) can see in a monitor refresh rate. And that's before considering what your actual hardware is capable of putting out with consistency. A friend of mine doesn't easily notice the difference between 30 and 60 hz, a literal hundred percent increase...though maybe that's years of playing games on a PSP having destroyed his eyes. That strikes me as shocking, but it'd be an incredibly douche-y thing to claim that I know his own eyes better than he does.
(EVGA GTX 1080)
That's surprising. I have a 50 ft DVI to HDMI cable that does 2160p at 60 hz (no HDR, but then again, HDR10 didn't exist a decade ago when I bought it). So your monitor itself is using a DVI input?
I already got this Logitech G203 RMA'd once for the left mouse button flaking out, and the replacement is already starting to do it. It hasn't even been a month.
What's my best bet for a mouse that isn't going to make me want to throw it against the wall? I'd prefer 5 button if possible. Not a fan of wireless.
One of the front USB ports on my Meshify C has stopped working. There had been something plugged into it that got jarred and it hasn't worked since then. Do I have any recourse here? It isn't loose or anything, but I never undid any screws on the back of the front panel to investigate further.
If it's a USB 3.0 slot, they come loose from the MB pretty easily. Maybe pop the side off and make sure it's still seated snug.
Good suggestion, but it didn't solve the problem. I reseated the connector a couple times and no bueno. I'll try taking apart the front panel connectors and see if anything obvious shows up.
The Define R6 has a modular front USB port thingy. Worst case scenario you might be able to get it replaced?
Doesn't look like that works for the meshify C unless I have misunderstood it.
To close this loop, I contacted Fractal and they are going to send me a replacement IO panel under warranty. They needed some info (S/N, model, receipt, etc) but nothing too onerous.
Anybody have a visceral reaction to "Asus Factory Recertified" GPUs? I think I'm going to pull the trigger on the one Aridhol pointed out earlier. I don't expect it'll be a problem, and 500 bucks is 500 bucks, but I thought I'd ask the hive mind anyway.
In many cases, refurbished equipment is better than brand new. When you're manufacturing a million of something it's just not feasible to have someone test each and every part that comes off the production line. With a refurbished product there's an actual human checking to make sure everything works on it before you buy it.
I love when replacing a PSU turns into taking the entire PC apart and cleaning it... but hey it's running, it's silent, cables are managed well and I couldn't be happier.
edit: also a trip to Micro Center to pick up a USB 2.0 header bridge since I didn't have enough on my mobo for front side panel, Kraken and PSU (these two require it for CAM).
The monitor in this case predates FreeSync (by maybe 3-6 months) and it's a Monoprice monitor. It's only got VGA and DVI inputs. It's my own fault, because the specs on the cable said 1920x1200 max and I didn't bother reading that part. I suspect it's DP 1.0 instead of...1.2? and so there's a limit due to bandwidth, etc.
Also because DVI, one of the anti-pull nuts came off when I unscrewed the old cable.
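For what it's worth, the ceiling here is probably the DVI side rather than the DisplayPort version: passive DP-to-DVI cables carry single-link DVI, which tops out at a 165 MHz pixel clock, roughly 1920x1200 at 60 Hz. A quick sketch of the arithmetic (the CVT-RB total timings below are approximate and vary slightly by mode):

```python
def pixel_clock_mhz(h_total, v_total, refresh_hz):
    """Pixel clock in MHz from total (active + blanking) timings."""
    return h_total * v_total * refresh_hz / 1e6

# Approximate CVT-RB (reduced blanking) total timings
modes = {
    "1920x1200 @ 60 (CVT-RB)": (2080, 1235, 60),
    "2560x1440 @ 60 (CVT-RB)": (2720, 1481, 60),
}

SINGLE_LINK_DVI_MHZ = 165  # also the ceiling for passive DP->DVI adapters

for name, (h, v, hz) in modes.items():
    clk = pixel_clock_mhz(h, v, hz)
    verdict = "fits single-link" if clk <= SINGLE_LINK_DVI_MHZ else "needs dual-link / active adapter"
    print(f"{name}: {clk:.1f} MHz -> {verdict}")
```

So 1440p60 needs dual-link DVI or an active DP adapter; no passive DP-to-DVI cable will do it regardless of brand.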
The monitor in this case predates Freesync (by maybe 3-6 mos) and it's a Monoprice monitor. It's only got VGA and DVI inputs. It's my own fault because the specs on the cable said 1900x1200 max and I didn't bother reading that part. I suspect it's DP 1.0 instead of...1.2? and so there's a limit due to bandwidth, etc.
Also because DVI, one of the anti-pull nuts came off when I unscrewed the old cable.
Ah, that makes sense. I've just had three monitors since they dropped DVI from displays (which I was initially upset about). At least with DVI, the number of pins on the actual cable itself can sometimes give you an idea of what to expect (unlike HDMI, DisplayPort, etc.).
As for the little nuts, that constantly happens on my GTX 1080ti (as I'm sure it does for most people who still use DVI outputs between gramophone playings and riding around on a metal bicycle with one really tall wheel in the front). Screw it back on really tightly with a pair of pliers (you could even use a small amount of Loctite/thread glue, it's almost perfect for monitor furniture).
I've got everything installed. Now time to cable manage. I was able to do push/pull with both rads! I should be able to start loading windows in an hour or 2. I think I'm going to get one of those Corsair Link fan/rgb hubs this week. I want more control over everything than I'm going to get with my current setup. The RGB controller my fans came with isn't software controlled and just has a bunch of basic patterns.
So I've reconsidered my opinion on going AMD (I will hold everyone here who recommended AMD personally responsible if anything goes wrong :P).
I'm just ripping out the guts of my current computer so it's not a full build. This is what I was looking at. However pcpartpicker is giving me warning about this mobo. "Some AMD X470 chipset motherboards may need a BIOS update prior to using Zen 2 CPUs. Upgrading the BIOS may require a different CPU that is supported by older BIOS revisions." Is that an issue?
Any particular reason you're going for that motherboard? I would save a bunch of cash and go with a cheaper X570, which would also answer your question about compatibility.
It's likely but not guaranteed that boards will be updated for Ryzen 3xxx series but it will depend on the retailer.
If you get a 370/350/470/450 board you may need to update the bios which will require a ryzen 1xxx or 2xxx series chip.
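If you're not sure what BIOS a board you already own is running before deciding whether an update is needed, Linux exposes the DMI info under sysfs without needing root (on Windows, `Get-CimInstance Win32_BIOS` gives the same fields). A minimal sketch, assuming the standard `/sys/class/dmi/id` layout:

```python
from pathlib import Path

def current_bios(dmi_dir="/sys/class/dmi/id"):
    """Read the running board's BIOS version/date from sysfs (Linux).
    Fields the kernel doesn't expose come back as None."""
    d = Path(dmi_dir)
    def read(name):
        p = d / name
        return p.read_text().strip() if p.exists() else None
    return {"version": read("bios_version"), "date": read("bios_date")}

print(current_bios())
```

Compare the version string against the vendor's CPU-support list for the board; if the listed minimum BIOS for Ryzen 3000 is newer, you'll need to flash first (or use BIOS Flashback if the board has it).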
Just an FYI the X470 Aorus looks like it's Ryzen3k ready, at least according to the Gigabyte website.
I figure most of em are/will be ready from retailers now but I just meant a $240 motherboard is a bit much for a $200 CPU. Unless Apostate has a good reason for a nice board like that.
Oh yeah.
I'd totally go with a B450M unless he's planning on doing some major overclocking.
Any suggestions for a better cooler since I'm saving some money there?
So, on the cooler: the one bundled with the 3600 is OK and free, but if I were to spend money I'd get a good one that you can likely keep for many years. Something like a Noctua NH-D14, or a be quiet! Dark Rock 4 or 4 Pro.
There are also "cool" looking AIOs (closed loop water coolers) like the h100i. I happen to like the look and the bling but your tastes may be different.
Basically my advice is to either keep the free cooler or go for something great that will last a long time.
Also, ryzen 3000 series benefits a lot from cooler temps as it can boost higher and longer.
Dark Rock 4 pro is my favorite / recommendation but make sure it fits in your case!
So this would be a great base for a system but you will have to check with the retailer about whether or not the board has the latest bios or have a friend with an older Ryzen chip who can help you update it.
OR just do what someone else did above and get a local store to update it cheap
Advantages here are the cooler is gonna make that chip icy cold and you'll have great boost performance and it will last between builds.
The Tomahawk board is also a great B450 board.
That's great news! I was worried because I did a deep dive and couldn't find one for sale anywhere, which led me to reading the Meshify C's manual and watching a dozen YouTube videos, and I couldn't even tell if it was removable. (It looked like the entire front part of the case was one piece.)
You want color? I got your color right here. I give you the official unveiling of my new gaming PC.
The very real next standard is going to be 4K60 with really granular HDR local dimming. Right now we're in the middle of another LED/PLASMA/720/1080/BLURAY/HD-DVD clusterfuck where all these competing standards and tech are fighting it out. I don't think 8k becomes the norm in this, but I can definitely see 4k doing so.
I'd say in another few years the HDR local dimming matrix will be small enough and have good enough response that it will be worth getting, and in the next 5 it'll be a de facto standard.
OLEDs are taking over from backlit LED panels, "local dimming" won't mean a damned thing anymore because every single pixel is its own backlight.
OLEDs still need to solve the burn-in issue. They are fine in mobile devices, where the screen isn't expected to be on for more than a few minutes at a time. In large-format displays like monitors and TVs you run a serious risk of burn-in from static screen elements. I saw an OLED TV that was left playing news 24/7, and after a while a distinct burn-in pattern could be seen.
Edit: Oh, it's not the response time! It's the input lag and overall shorter lifespan.
It isn't going to be a super fun swap since I will have to re-run all the cables to the Mobo. I'm still trying to decide if one more USB port is worth the work.
Build complete. Now to enjoy.
Something like this? https://www.amazon.com/dp/B07HZ4N7PJ/
It's one I'm halfway considering. It's only 100Hz, though. I wonder if a 5700XT would be enough for that?
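As a very rough gauge of whether a card can drive a given monitor, you can compare raw pixel throughput (this ignores game settings, engine, and the actual panel, so treat it only as a ballpark):

```python
def mpix_per_s(w, h, hz):
    """Raw pixel throughput in megapixels/second, a crude proxy for GPU load."""
    return w * h * hz / 1e6

targets = {
    "2560x1440 @ 144": mpix_per_s(2560, 1440, 144),
    "3440x1440 @ 100": mpix_per_s(3440, 1440, 100),
    "3840x2160 @ 60":  mpix_per_s(3840, 2160, 60),
}
for name, m in targets.items():
    print(f"{name}: {m:.0f} Mpix/s")
```

Ultrawide 1440p at 100 Hz lands in roughly the same ballpark as 4K60 and a bit below 1440p at 144 Hz, so a card comfortable at high-refresh 1440p should be in the right neighborhood, depending on the game.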
Haven't you heard?
They make human CPU coolers now!
I tried the 30 day trial, and it didn't work.
In many cases, refurbished equipment is better than brand new. When you're manufacturing a million of something it's just not feasible to have someone test each and every part that comes off the production line. With a refurbished product there's an actual human checking to make sure everything works on it before you buy it.
edit: also a trip to Micro Center to pick up a USB 2.0 header bridge since I didn't have enough on my mobo for front side panel, Kraken and PSU (these two require it for CAM).
That man doesn't believe what he's reading because he's blinking too much. He knows the product is bullshit and he's doing it for the money.
The monitor in this case predates Freesync (by maybe 3-6 mos) and it's a Monoprice monitor. It's only got VGA and DVI inputs. It's my own fault because the specs on the cable said 1900x1200 max and I didn't bother reading that part. I suspect it's DP 1.0 instead of...1.2? and so there's a limit due to bandwidth, etc.
Also because DVI, one of the anti-pull nuts came off when I unscrewed the old cable.
Ah, that makes sense. I've just had three monitors since they dropped DVI from displays (which I was initially upset about). At least with DVI, the number of pins in the actual cable itself can sometimes give you an idea of what to expect (unlike HDMI, DisplayPort, etc.).
As for the little nuts, that constantly happens on my GTX 1080ti (as I'm sure it does for most people who still use DVI outputs between gramophone playings and riding around on a metal bicycle with one really tall wheel in the front). Screw it back on really tightly with a pair of pliers (you could even use a small amount of Loctite/thread glue, it's almost perfect for monitor furniture).
I'm just ripping out the guts of my current computer so it's not a full build. This is what I was looking at. However pcpartpicker is giving me warning about this mobo. "Some AMD X470 chipset motherboards may need a BIOS update prior to using Zen 2 CPUs. Upgrading the BIOS may require a different CPU that is supported by older BIOS revisions." Is that an issue?
PCPartPicker Part List
CPU: AMD Ryzen 5 3600 3.6 GHz 6-Core Processor ($197.85 @ Amazon)
CPU Cooler: Cooler Master Hyper 212 EVO 82.9 CFM Sleeve Bearing CPU Cooler ($34.89 @ OutletPC)
Motherboard: Gigabyte X470 AORUS GAMING 7 WIFI ATX AM4 Motherboard ($239.99 @ Amazon)
Memory: Corsair Vengeance RGB Pro 16 GB (2 x 8 GB) DDR4-3200 Memory ($94.99 @ Amazon)
Total: $567.72
Prices include shipping, taxes, and discounts when available
Generated by PCPartPicker 2019-08-16 14:28 EDT-0400
Any particular reason you're going for that motherboard? I would save a bunch of cash and go with a cheaper X570, which would also answer your question about compatibility.
It's likely but not guaranteed that boards will be updated for Ryzen 3xxx series but it will depend on the retailer.
If you get a 370/350/470/450 board you may need to update the BIOS, which will require a Ryzen 1xxx or 2xxx series chip.
I would do this honestly
PCPartPicker Part List
CPU: AMD Ryzen 5 3600 3.6 GHz 6-Core Processor ($197.85 @ Amazon)
CPU Cooler: Cooler Master Hyper 212 Black Edition 42 CFM CPU Cooler ($36.89 @ OutletPC)
Motherboard: MSI MPG X570 GAMING PLUS ATX AM4 Motherboard ($159.89 @ OutletPC)
Memory: Corsair Vengeance RGB Pro 16 GB (2 x 8 GB) DDR4-3200 Memory ($94.99 @ Amazon)
Total: $489.62
Prices include shipping, taxes, and discounts when available
Generated by PCPartPicker 2019-08-16 14:39 EDT-0400
Just an FYI, the X470 Aorus looks like it's Ryzen 3000 ready, at least according to the Gigabyte website.
I figure most of em are/will be ready from retailers now but I just meant a $240 motherboard is a bit much for a $200 CPU. Unless Apostate has a good reason for a nice board like that.
Well I definitely like better and cheaper.
Any suggestions for a better cooler since I'm saving some money there?
Oh yeah.
I'd totally go with a B450M unless he's planning on doing some major overclocking.
So, about the cooler: the one bundled with the 3600 is OK and free, but if I were to spend money I'd get a good one that you can likely keep for many years. Something like a Noctua NH-D14, or a be quiet! Dark Rock 4 or Dark Rock Pro 4.
There are also "cool" looking AIOs (closed-loop water coolers) like the H100i. I happen to like the look and the bling, but your tastes may be different.
Basically my advice is to either keep the free cooler or go for something great that will last a long time.
Also, the Ryzen 3000 series benefits a lot from cooler temps, as it can boost higher and longer.
The Dark Rock Pro 4 is my favorite / recommendation, but make sure it fits in your case!
OR just do what someone else did above and get a local store to update it cheap
Advantages here are the cooler is gonna make that chip icy cold and you'll have great boost performance and it will last between builds.
The Tomahawk board is also a great B450 board.
PCPartPicker Part List
CPU: AMD Ryzen 5 3600 3.6 GHz 6-Core Processor ($197.85 @ Amazon)
CPU Cooler: be quiet! Dark Rock Pro 4 50.5 CFM CPU Cooler ($89.90 @ Amazon)
Motherboard: MSI B450 TOMAHAWK ATX AM4 Motherboard ($114.89 @ OutletPC)
Memory: Corsair Vengeance RGB Pro 16 GB (2 x 8 GB) DDR4-3200 Memory ($94.99 @ Amazon)
Total: $497.63
Prices include shipping, taxes, and discounts when available
Generated by PCPartPicker 2019-08-16 15:05 EDT-0400
I need to swap 2 of the RGB headers on the controller. Give me a little while and I'll hit you with a mess of bloody color.
That's great news! I was worried because I did a deep dive and couldn't find one for sale anywhere, which led me to reading the Meshify C's manual and watching a dozen YouTube videos, and I still couldn't tell if it was removable. (It looked like the entire front part of the case was one piece.)
https://gfycat.com/accuratelightheartedkookaburra
You gotta properly tag @Hardtarget !!
OLEDs are taking over from backlit LED panels, "local dimming" won't mean a damned thing anymore because every single pixel is its own backlight.
OLEDs still need to solve the burn-in issue. They're fine in mobile devices, where the screen isn't expected to be on for more than a few minutes at a time. In large-format displays like monitors and TVs you run a serious risk of burn-in from static screen elements. I saw an OLED TV that was left playing news 24/7, and after a while a distinct burn-in pattern could be seen.
Edit: Oh, it's not the response time! It's the input lag and overall shorter lifespan.
Also, HDR will kill OLEDs a lot quicker.
It isn't going to be a super fun swap since I will have to re-run all the cables to the Mobo. I'm still trying to decide if one more USB port is worth the work.
I was just waiting to see what happened if we said his name three times.