I was debating on creating a thread about setting up a NAS, but maybe it's better served here for now.
I currently have a retired HP Z420 workstation functioning as a NAS and Plex server. It's been reliably running 24/7 for 4 or 5 years now, but I'm trying to be a little more environmentally conscious and move to something more efficient. I wasn't sure how bad it was until I looked up an old AnandTech review, which showed it burning 90W at idle. Yikes!
I recently picked up an NVIDIA Shield TV Pro, which has had some minor teething issues but seems like it could replace the Plex server role adequately. It's consuming somewhere around 10-20W when serving media and 2-3W at idle. I still need to pair it with a NAS to hold all my media and handle general file storage and backup. I've considered building an rPi NAS or buying a prebuilt drive enclosure, but I'm curious if I can just build a modern PC that's power-efficient enough to be competitive.
Ideally I'd like something that can maintain 20-25W at idle for a pure NAS, which would cut my power consumption in half. If I can build a server I'd be willing to bump that number up a bit, since I could also use it for Plex and leave the Shield as a streaming platform only. Used HP MicroServers seem like they might be a good candidate in this area - what do you think?
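For what it's worth, the back-of-the-envelope math on that idle-power cut looks like this. The $0.15/kWh electricity rate is a placeholder assumption; plug in your own local rate:

```python
# Rough savings from cutting 24/7 idle draw from 90W down to 25W.
# RATE_USD_PER_KWH is an assumed placeholder, not a real quoted rate.
HOURS_PER_YEAR = 24 * 365          # 8760
RATE_USD_PER_KWH = 0.15

def annual_kwh(idle_watts):
    """kWh per year for a machine idling 24/7 at the given draw."""
    return idle_watts * HOURS_PER_YEAR / 1000

def annual_cost_usd(idle_watts):
    return annual_kwh(idle_watts) * RATE_USD_PER_KWH

print(annual_kwh(90))                             # 788.4 kWh/yr for the Z420
print(annual_cost_usd(90) - annual_cost_usd(25))  # ~$85/yr saved at the assumed rate
```

Not a fortune in dollar terms, but cutting roughly 570 kWh/yr is a real reduction for a box that mostly sits idle.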
As far as PCs go, I doubt you could hit the idle draw that an rPi is capable of. And now, with the rPi 4 Compute Modules having an actual PCIe x1 lane, you can build a decent 4-drive NAS with one of those.
Other than that, the only thing that may approach that efficiency would be a prebuilt NAS that used an ARM processor. One thing to keep in mind also is that NAS grade spinning disks consume some level of power, even if they aren't doing anything. They don't park the heads and spin down nearly as often as consumer drives do (if they do at all). The PC I built for a NAS a few years ago has a slightly more modern processor than the one you are using, and kept to about 30-40 watts at idle before drives, but with 8 NAS grade disks installed it usually consumed 90W+.
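To put rough numbers on that, total idle draw is basically base platform draw plus per-drive idle draw. The 5-8W-per-drive figures below are ballpark assumptions for 3.5" NAS-grade disks that never spin down; check your drives' datasheets for real numbers:

```python
def nas_idle_watts(base_watts, drive_count, watts_per_drive):
    """Estimated total idle draw: platform plus always-spinning disks."""
    return base_watts + drive_count * watts_per_drive

# A ~30W platform with 8 drives idling at ~8W each lands right around
# the 90W+ observed above; 4 drives at ~5W fits a much tighter budget.
print(nas_idle_watts(30, 8, 8))   # prints 94
print(nas_idle_watts(30, 4, 5))   # prints 50
```

The takeaway being that once you're past a handful of always-spinning disks, the drives dominate the power budget no matter how efficient the platform is.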
Steam - Synthetic Violence | XBOX Live - Cannonfuse | PSN - CastleBravo | Twitch - SoggybiscuitPA
My friend is talking about getting a prebuilt with a 2070 Super and an i7 10700F for $1300. Bonus points for also using an HDD.
Try as I might, I cannot show him the light.
A friend of mine who normally builds his machines started pricing out stuff from iBuyPower and others, and I was seriously shocked at the margins. Throw in a 5900X? $1100 increase. Want a 3080 in there? $1100 increase.
Jesus tapdancing Christ, it's cheaper to pay a scalper and build it yourself!
So, here's where we may start seeing ray tracing normalize a bit.
The Medium is the first major game targeted at next gen consoles also getting a PC release. Those next-gen consoles have AMD metal in them, so obviously if that was the target, they built the systems out for that.
**image snipped**
So with a game that had a dev pipeline with targets of good RT performance on next gen consoles, they are placing the RX 6800 in the 2080/3060 nvidia range, and the 6800 XT against the 3080, insofar as recommended hardware for RT is concerned.
I still believe the 3080 will get more frames, especially if AMD's answer to DLSS remains unlaunched, but that strikes me as a lot better than many would have expected.
This is kind of what I'm expecting going forward.
RT performance will (mostly) get better as we go along.
Mostly because if you build with a console-focused RT implementation you're going to be 99% raster limited. Or even with AMD-focused implementations, like uh..... Godfall?
i have a fractal design case that was originally launched in 2014 and discontinued a couple of years ago
it's well out of warranty but they're sending me two hard drive cages for free inc free postage
great customer service. My next case was gonna be a fractal design anyway but this fuckin' rules
Also because nvidia is a hardware manufacturer who hasn't had Rockstar, EA, Sony and Xbox division money out there figuring out more efficient ways to use their RT hardware - expectations of performance based on what is effectively "first pass" software solutions will not be representative of how ray tracing will perform going forward.
You won't be seeing shittier ray tracing, you will be seeing more efficient, better ray tracing, as more folks get in the pool who own engines and massive multinational divisions making games, and that is what these consoles are going to provide for everyone, nvidia included.
SW-4158-3990-6116
Let's play Mario Kart or something...
I dunno, this feels a lot like the FXAA situation we had this gen. The consoles weren't powerful enough to do more accurate AA implementations, so instead we ended up with everyone using a shitty one that killed IQ, because it was universal and had next to no performance impact.
This is my read as well. By most measures I've seen, Nvidia's hardware has about 2x the pure path tracing compute power as AMD's hardware. It stands out most starkly in Minecraft RTX and Quake 2 RTX which are pure path tracing games. Too much of AMD's RT pipeline competes for resources with their standard compute cores.
If AMD's capabilities become the standard thanks to their ubiquity on console, I think it will hamstring PC gaming's visual fidelity for a generation or more. Not unlike the dark times of PS3/X360, where most games seemed to target that hardware first, and then PC ports that really didn't make the best use of PC hardware were what passed for "AAA" releases. Real PC releases that aggressively pushed the hardware were few and far between.
I think we'll see developers shy away from the stunning results purely path traced games can deliver, towards what a lot of the earliest RTX games had which was marginally better reflections and shadows. Maybe RT global illumination too. Maybe some limited diffuse lighting. Probably never all of the above on console. Options to turn them all on and up on PC. But I think there is a real possibility the Nvidia RTX pipeline will be left wanting more to do on console oriented RT implementations. And I highly doubt we'll see any more fully path traced games that aren't explicitly sponsored by Nvidia. Not like we've seen one yet that wasn't anyways.
Or, we are wasting a shitload on pathing we can't even see because engines are being inefficient, and games like Minecraft exemplify this because they more or less just turned the math on. Performance is bad even when stuff immaterial to what is being rendered is in frame but blocked by walls or whatever.
I would suggest watching the RTX demo for this game mentioned earlier. It looks really good, good use of reflections, etc.
That said, it's probably still better to have what handful of effects we can claw away from the consoles' limited capabilities so that the devs can start working with the pipeline, instead of just waiting for another generation to even get started. I have full faith that come the mid-generation refreshes, we'll have better implementations. Even if it's probably not until the actual next generation that we start having games that use a full RT lighting engine, instead of one that is 90% raster tricks.
What nvidia is doing isn't a true raster pipeline right now anyway, as you are relying heavily on machine-learning temporal rendering to fake all sorts of stuff and create a pleasing image at an acceptable resolution and framerate.
Which is kind of my point. DLSS 2.0 is such a dramatic jump over DLSS 1.0, on the same hardware, that people saying we are hardware locked on features that make shinies better just makes me scratch my head. You have proof right there that we will likely see a ton of advancement in what similar hardware can do over time, just by developing better engines and drivers and APIs for them.
All of those recommended settings on The Medium target 30fps, which simply isn't how PC gamers will approach picking settings. As such, not super useful IMO.
We're kind of getting to the point where AA is going to be fairly useless, except for cleaning up a few noticeable edges, and it's probably best if we don't spend 20% of our system resources doing SSAA
Maybe if we get to 8K in under 30", but aliasing is still a huge issue at 4K. The inability to tweak it in Cyberpunk is killing me (even though it clearly has TAA)
PSU/RAM arrived today for wife's build, time to complete it!
I moved the stock Fractal 140mm fans that came in my S2 over to replace the single front 120mm Fractal in her Meshify C, just need to get the PSU in and motherboard plugged in so I can put the AIO back on top.
5800X / 3080 TUF build, fits snugly but nicely into the C with Corsair 240mm AIO.
Aliasing is my #2 issue in PC gaming (after screen tearing). Any jaggies at all just make my teeth grind. AA is still hugely important, at least at the resolution I play at (1440p on a 27" monitor).
Price aside, this makes sense to me. I think Nvidia is realizing what a horrible mistake it was discontinuing the entire RTX 2000 line as early as they did. Once upon a time, last gen's graphics cards were the budget option. There was no "budget" GeForce 256. You got a Riva TNT2 if you were on a budget.
To the best of my knowledge, the RTX 2060 uses an entirely different supply chain than the RTX 3000 series as well. So this shouldn't cannibalize the supply of the more sought-after 3000 series cards.
That said, MSRP for these is bonkers. Then again, maybe that's what the market can bear right now.
Jesus
I guess I'm going to be sitting on this 1080 Ti longer than I thought
Might just sit out the video card arena for a few more years
@Orca I feel the same way. My 1080 Ti can still do what I need it to do at 1440p. Yeah, I'm missing out on DLSS and Ray Tracing, but those are just 'nice to haves'.
The money that I had set aside for a 3080/6800 XT is now going to a different hobby.
I'll try again in 2022.
" I am a warrior, so that my son may be a merchant, so that his son may be a poet.”
― John Quincy Adams
This must be on TVs
I barely notice aliasing at 1440p on a 32"
Uh... no.
I have this and I definitely notice aliasing, and it sucks.
the pricing is nonsense. The 2060 super prices there are higher than the 3060!
Yeah, aliasing is a huge problem on my 27" 4K monitor
I think part of the trick to actually building gaming PCs this past year has been: be outside the USA?
Which isn't much help to folks, but despite the USA sources being frequent drops, they're also the most targeted by bots. As a Canadian, my USA attempts have not panned out at all, and all my acquisitions have been through much slower but steadier CA sources.
Backorders through local stores, online drops on CA-only stores, etc.
USA on the other hand feels very "free market" where you're up against bots and scalpers, and fighting the crab bucket / trying to win the lottery. Way more stressful even if people are getting more inventory out there?
Tonight I'll be putting together the second new PC build, and it's turned out for us as:
Release day camping store for Ryzen, got a 5800X.
Online drop on Canadian source for 3090 RTX, was shared by friend and purchasable by humans.
Store backorder came in for 3080 RTX, just before Christmas.
Friends backorder came in for 5900X in January, didn't want it any longer, bought it from him.
Tbf my eyes are garbage, now that I think about it
I did it with Canada Computers, and it seems the particular card I chose, the TUF 3080 non-OC, is basically non-existent, but the other cards have queues a mile long.
The only silver lining was I paid $899 CDN, although I half expect them to just cancel my order due to me getting in at a cheaper rate.
I'd like a 5900X, but that seems unlikely to happen this year. I'm trying to decide if I want to just grab a 5800X next time they're in stock at Microcenter, or grab a 3600X for now to build out the whole computer (without GPU) and grab a 5800X or 5900X sometime in the future. With the console core/thread counts being what they are, I figure if I can match or exceed that I'm golden for a good long while.
Do I have to care about motherboard VRMs if I don't want to fuss around with overclocking? Is there a good resource for seeing if RAM will clear a Noctua air cooler?
Spica enjoys the butt-warming features of the current machine and can often be found here when it's on:
But she also likes watching TV, meaning that since my desk can't fit a second monitor, I still have one set up just to keep her from sitting in front of me:
Corvus tends to focus his attention elsewhere when a computer is involved
Really, he's a total cuddle monster
Spica resents the implication that she's not also a total cuddle monster
You don't get a 3-year warranty with scalpers.
And the 3080 is still $1400-$1500 on eBay
I still haven't played it much at all but when I do it's so pretty
Inquisitor77: Rius, you are Sisyphus and melee Wizard is your boulder
Tube: This must be what it felt like to be an Iraqi when Saddam was killed
Bookish Stickers - Mrs. Rius' Etsy shop with bumper stickers and vinyl decals.
https://www.newegg.com/product-shuffle
"NewEgg Shuffle"
TL;DR: enter a drawing for a chance to purchase horrible combos
Penny Arcade Rockstar Social Club / This is why I despise cyclists
And this page has already expired
If they're moving to a lottery system AND still bundling in unnecessary shit then fuck newegg even more
Corvus might be doing a QA pass on the new case, but there's absolutely nothing to put in it.
I've called 3 times to ask about it, and each time they just laugh at me...
Trying to track it through Discord or other channels is even worse, as it's catered to US retailers.
A 3070 or 3080? Impossible.
If Antonline hadn't done their silly "preorder" that nobody else trusted, I wouldn't have a 5900x.
If Best Buy hadn't actually tried doing something to stop botters, I wouldn't have a 6800.
Where/who with? I didn't pay nothing to be on backorder, Memory Express just calls me when I'm up and then I make a purchase.
Microcenter sounds like the closest equivalent to my experience, but that doesn't work if you don't have one available.
I'll give you pictures of both cats: