Well, the 3070 Ti showed up (specifically, an EVGA RTX 3070 Ti FTW3). After running it through a few games (HZD, CP77, Control, FF15, and a few others), I think I might have to pop the cooler off and redo the TIM and/or thermal pads/putty. On the initial load to try things out, Control was hitting the low 80s C. Enabling Vsync and setting the frame rate limiter to 60fps got the temps to settle down. So now I've limited frame rates across the board to 60fps to see how things shake out:
Cyberpunk 2077: mid-to-upper 70s C
Horizon Zero Dawn: upper 70s (peaking at 79C)
Control: ~67C at the peak
FF15: again, upper 70s (peaked at 79C)
Mechwarrior 5: peaked at 81C in the dropship with a full load of shiny mechs
There were a few I tried that hung out in the 60C-or-less neighborhood (things like CoD: Cold War, Monster Hunter World, and Wasteland 3)... but overall, things seem pretty warm.
And this is after I set up a fairly aggressive fan profile with MSI Afterburner. I might also look into undervolting the thing a smidge...
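If anyone wants to run the same comparison, here's a rough sketch of the kind of monitoring involved: polling core temperature with nvidia-smi while a frame-capped game runs. This assumes an NVIDIA card with nvidia-smi on the PATH; the helper names are mine, but `temperature.gpu` is a real query field.

```python
# Sketch: poll GPU core temperature while a frame-capped game runs.
# Assumes an NVIDIA card with nvidia-smi on the PATH; the helper
# names here are mine, but temperature.gpu is a real query field.
import subprocess

def parse_gpu_temps(csv_output: str) -> list[int]:
    """Parse `--format=csv,noheader,nounits` output into per-GPU temps (C)."""
    return [int(line.strip())
            for line in csv_output.strip().splitlines() if line.strip()]

def read_gpu_temps() -> list[int]:
    """Query the current core temperature of each installed GPU via nvidia-smi."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=temperature.gpu",
         "--format=csv,noheader,nounits"], text=True)
    return parse_gpu_temps(out)
```

Run `read_gpu_temps()` in a loop during a play session and you get the same peak numbers as above without any extra overlay software.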
Look into undervolting, but these cards just run kinda hot. I was pegged at 83C the whole time in Control on my 3080.
The_Spaniard (Irvine, California):
I just skipped ahead to the next step and started the RMA process. They want me to cover my own shipping, and to get tracking and insurance on it. It looks like they refurbish and make you sit on your ass and wait in the meantime: "In-House Turn-Around Time (excluding shipping) 15-35 business days".
You probably saw it in the MWO/MW5/BT thread already, but someone noticed that the MW5 menus tend to drive the framerate sky-high, which leads to video cards getting warm/hot. I limited my 3060 Ti to 60FPS and the fans never got loud after that (I don't have any fancy software installed to monitor the temps).
I've had this Samsung 850 EVO SSD 500GB for a few years now. It's written 68 TB of data. Samsung Magician says it's still good, but I worry about it. No idea how long these things are supposed to last.
My 3080 runs loud and hovers around 68C. Seems normal for this gen?
Stragint:
My 3080 ti is sitting around 76C to 81C while playing Back 4 Blood. Should I be concerned? I don't really know anything about optimizing video card use.
PSN: Reaper_Stragint, Steam: DoublePitstoChesty
What is the point of being alive if you don't at least try to do something remarkable? ~ Mario Novak
I never fear death or dyin', I only fear never trying.
If you want the temperatures cooler, you can always adjust the fan curve in something like MSI Afterburner to ramp up to higher speeds sooner. By default, the fan curves are optimized for low noise over low temps. You can also look into undervolting, which works extremely well with the 30-series cards.
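For anyone curious what "adjusting the fan curve" amounts to under the hood, it's just a piecewise-linear map from temperature to fan speed, something like this sketch (the breakpoints are made-up examples, not recommended values):

```python
# Sketch of what an Afterburner-style fan curve boils down to:
# piecewise-linear interpolation over (temp C, fan %) breakpoints.
# The breakpoints below are made-up examples, not recommended values.
CURVE = [(30, 30), (50, 40), (65, 60), (75, 80), (83, 100)]

def fan_percent(temp_c: float, curve=CURVE) -> float:
    """Map a GPU temperature to a fan duty cycle along the curve."""
    if temp_c <= curve[0][0]:
        return curve[0][1]
    if temp_c >= curve[-1][0]:
        return curve[-1][1]
    for (t0, f0), (t1, f1) in zip(curve, curve[1:]):
        if t0 <= temp_c <= t1:
            # Linear blend between the two surrounding breakpoints.
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)
```

An "aggressive" profile just shifts those breakpoints left so the fans spin up at lower temperatures, trading noise for headroom.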
FWIW, a watercooled 3080 rarely goes above 65C under heavy load.
Yeah, I've been super careful about getting a frame rate limiter enabled for the menus ASAP (if one's available... if there isn't, I'll just use Vsync). Probably the highest frame rate I was seeing was in Phasmophobia (250+)... so I Vsync'd that immediately.
Those temperatures are normal for modern high end video cards.
Okay, cool. I've been used to seeing upper-50s/mid-60s with my old 2070 Super, so seeing 80s was getting me a little concerned. Didn't help that Googling about it resulted in the usual: "It's fine/broken/needs a little repasting/my cousin's 1050ti never got near 50c while playing CSGO/etc..." blend of answers you'd expect.
| Origin/R*SC: Ein7919 | Battle.net: Erlkonig#1448 | XBL: Lexicanum | Steam: Der Erlkönig (the umlaut is important) |
Doesn't taking off the heatsink to redo the thermal paste void the warranty? I remember reading that in an article about a guy who found a tip of a glove in his. The manufacturer tried to claim that he voided the warranty till he showed them what they did.
Googling strikes again! According to what I could find, removing the heatsink is:
1) Safe and totally won't void the warranty
and 2) Totally will void the warranty
-Loki-:
Image linked because it's gigantic, but someone noticed some blue inside his 6700XT when he was installing it and popped off the heat sink only to discover the guy on the production line at PowerColor forgot to take the tape off the thermal strips.
It depends on the company. EVGA is fine as long as you put it back together before sending it in for service; I know some others are less forgiving.
jungleroomx:
Samsung's warranty on the 850 EVO 500GB is 5 years or 150 TBW, whichever comes first.
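At the rate quoted above, you can ballpark the remaining endurance; here's a quick sketch of the arithmetic (the function name is mine, and "a few years" is assumed to mean about four):

```python
# Back-of-envelope endurance math for the figures in this thread:
# 150 TBW warranty limit, 68 TB already written. The write rate below
# assumes "a few years" means about four; adjust to taste.
def years_of_writes_left(tbw_limit_tb: float, tb_written: float,
                         years_in_service: float) -> float:
    """Project remaining years at the observed average write rate."""
    tb_per_year = tb_written / years_in_service
    return (tbw_limit_tb - tb_written) / tb_per_year

# 68 TB over ~4 years is 17 TB/year, so (150 - 68) / 17 ~= 4.8 more years.
```

In other words, the drive would hit its rated TBW well after the 5-year warranty clock runs out at that pace, so the warranty term is the binding limit, not the writes.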
So dumbass me ordered custom dual wrapped 8pin PCIe cables for the wrong type of Seasonic power supply from Ensourced (all black cables, connectors, and cable combs). If anyone has a Seasonic power supply that's a Type 1 on the Ensourced chart and can use them let me know. They're of absolutely no use to me.
Live and learn: double- or triple-check that something very custom and non-returnable matches the specs I need before ordering.
Benchmarks of the 'Rembrandt' sub-45W Zen3+/RDNA2 core APU have been leaked; 8c/16t Zen3+ (V-cache not confirmed and unlikely IMO) plus 12 RDNA2 cores, DDR5 support.
GPU performance is around the GTX 1050/RX 560 level. This is going to be enough GPU for a lot of people, and it's going to kick the stool out from under what remains of the low-end video card market. My #2 nephew turns 10 in just under a year, and I would be super excited to get him a mini PC based on something like that.
Heck, if I put an NVMe drive and 32GB of RAM in, it'd last him into his teens, and then still be good enough for his little brother after that.
Orca:
Wow. Yeah, looks like we're going back to the bad old days of pay a kidney for a videocard because the built-in one is good enough for non-enthusiast use.
I wonder if it might be enough to siphon more market share from Intel in the laptop market. Intel still dominates it.
The low end market has been dead for a couple generations anyway. And the shortages were just the nail in the coffin
GnomeTank (Portland, Oregon):
Apple is getting into the game now as well. The GPU in the new M1 Max is supposed to be pretty beefy, with something equivalent to 4096 Nvidia CUDA cores but using less power. If Intel starts sticking DG2-based GPUs in their CPU packages at the lower end, it may not be the most terrible thing in the world for discrete GPUs to sort of die off at the low end. It moves our cheese, to be sure, but if the performance ends up roughly equivalent at the low end with decent power efficiency, that seems fine to me?
-Loki- (edited October 2021):
From what I've seen, the M1 benchmarks at GTX 1050/RX 560 levels as well. Kind of funny reading the benchmarks for it now, when 6 months ago a Mac guy at work was going on and on about how the M1 was blowing away 'gaming PCs'.
I like the upgrade path you get with an AMD APU. Like, if you need a PC to get you going and you really want something you can play games on but can't afford that yet, with the 5700G you can build a 1080p-capable PC that will get you by until you can afford a video card. Even then, you don't need to do anything other than install the video card. The 5700G isn't quite up to the standards of the 5800X, but it's close enough that with a good video card you're not going to see too much difference. The new 6000-series APUs sound like they'll be even better for that.
I expect that beefy dies like the M1 Max, or Intel Sapphire Rapids with HBM, will require being fully soldered to the motherboard, mainly due to how they integrate all the memory components to get that amazing memory bandwidth.
Orca:
I don't like the trend towards integration. It means we can't as easily mix and match our components, and it's another step towards PCs just being an SoC in a tower rather than easily upgradeable over time.
jungleroomx:
The M1's GPU does half the TFLOPS of a 3080, so I want to see benchmarks. I know FLOPS aren't exactly the be-all end-all of performance measuring, but when you go from 19+ to 10, well, I feel like pressing X to doubt.
I'm sure they did what every other company does and found the right benchmark.
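For reference, the rule of thumb behind those TFLOPS figures is shader cores x boost clock x 2 (an FMA counts as two ops per cycle). A quick sketch, with hypothetical round-number inputs rather than any vendor's official spec:

```python
# The rule of thumb behind peak FP32 TFLOPS comparisons like the one
# above: shader cores x boost clock x 2 (an FMA counts as two ops).
# The example inputs are hypothetical round numbers.
def fp32_tflops(shader_cores: int, boost_clock_ghz: float) -> float:
    """Peak theoretical FP32 throughput in TFLOPS."""
    return shader_cores * boost_clock_ghz * 2 / 1000.0

# e.g. 4096 cores at 1.3 GHz works out to ~10.6 TFLOPS.
```

Which is also why FLOPS alone mislead: the formula says nothing about memory bandwidth, drivers, or how well games actually feed those cores.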
GnomeTank:
The M1 Max in the new MacBook Pro? I've seen closer to 3060 levels (obviously without raytracing). Be interesting to see what the independent benchmarks confirm, but it's got a lot of execution units and a unified memory architecture. On paper it should be easily on par with the current consoles and probably a bit above that. Not mind blowing, but for an SoC GPU, with a pretty nice power profile? It's not nothing.
That's why I don't like the trend of moving so many components onto the mobo either, like the step-down converters that ATX12VO requires.
GnomeTank:
For a $3000+ laptop it better be not nothing.
Neither the price nor it being Apple is why I find it cool or exciting. It's a good thing, overall, that something other than x86 + Nvidia/AMD is starting to push performance and efficiency. It's good for the long-term health of the industry and our hobby.
Anything Apple touches or makes is not good for the industry or our hobby.
I say this as someone with an iPhone sitting right next to me.
Apple represents the death of our hobby.
jungleroomx:
And to be clear, it's not just their walled garden garbage.
It's their influence on the industries they touch: pushing them towards unfixable, unupgradeable, disposable monolithic computing equipment, and the mountains of e-waste that result.
-Loki-:
Well, he was going on about it over 6 months ago (before we went into our super long lockdown), and the new Macbook pro released what, a few days ago?
So no, not the Pro or Max, just the M1.
GnomeTank:
This just reads as Chicken Littling to me and seems a tiny bit hyperbolic. You're entitled to your opinion, but Apple finally pushing ARM processors into the realm of serious performance is overall a good thing. Apple is a mega-corporation; they have their investors in mind at all times, as all large corporations do. That doesn't change the fact that them pushing a processor based on a relatively open and available instruction set architecture to near-desktop performance is a good thing. It's no RISC-V, but it's better than only having one player (x86) in the performance conversation. If Apple can do it, and they clearly can, that means others can as well.
And they've also shown all of these other companies how to make it locked down and the direct opposite of open.
What we're doing here, in this thread, is the result of inertia from business decisions made in the 1980s by fledgling corporations. When x86 gets replaced, it's going to be walled-garden city, and all this wonderful stuff we know now is going away.
Showing a reasonable alternative architecture that will allow hardware walled gardens to pop up is the worst thing that could happen.
minor incident (NJ):
I’m afraid I don’t see any real correlation (or causation for that matter) between moving on from X86 and everything going “walled garden”.
I guess I’m also not clear what you mean. If you’re just talking about discrete hardware vs SOCs, I mean, that sort of thing has existed forever (and been good enough for 90% of people just as long) and the enthusiast/gaming hardware market is still the area of pc hardware seeing the most growth.
It just sort of feels like the same “death of the enthusiast pc market” thing I’ve heard predicted for 20 years now.
What makes Apple’s successful foray into ARM architecture for desktop hardware a black swan compared to everything else that’s come before?
Everything looks beautiful when you're young and pretty
jungleroomx:
What made Apple's successful foray into smartphones, laptops, services, and maintenance different than the numerous companies that came before?
Other than a trimmed-down line of anticompetitive walled-garden ecosystems, mountains of e-waste, nickel-and-diming customers with monthly charges, suppression of right to repair, and abusive, monopolistic practices in their app stores that every other company has worked tirelessly to duplicate.
Without the baggage of old deals hanging on, we’ve seen how tech companies act. Why do we think it’s going to be different and they won’t tend to their worst like they do with everything else?
jungleroomx (edited October 2021):
Also, I don't have hope of ARM being anything like it is now if Nvidia's acquisition goes through.
minor incident:
Again, I’m just not seeing what you’re getting at here. That’s always been Apple’s MO. It hasn’t really affected the enthusiast PC hardware market to any degree. And it’s not like any of that was new ground that Apple trailblazed in the smartphone market either, we just had smaller, shittier ecosystems that were just as walled in. I’m not seeing the doom and gloom on the horizon here, man.
jungleroomx:
It has been Apple's MO, and they just showed everyone how to do it with computers.
I don’t have faith that X86s eventual replacement will be this open. I think you’re going to get completely different architectures and I think the mindset of locking people into an ecosystem is going to be the norm, because that’s what they’re already doing with everything else.
You can call it doom and gloom all you want. I call it reading the room.
minor incident:
How did they just show everyone that? What they’re doing now is no different than what they’ve been doing for the last 10 years or so. The only difference is a change in processor architecture. I still don’t see how that impacts the enthusiast market at all. It hasn’t to date, and Apple making well performing hardware based on their own silicon doesn’t feel like it changes that equation at all, really.
Orca:
Everyone wants a piece of that sweet sweet subscription money.
The open PC is an accident of poor business choices that spiraled beyond control. Will it hold? I hope so. Cellphones, consoles, and Apple show it isn’t a law of nature that to be successful you must be open. If VR/AR takes off there is no guarantee those computing/display devices will be open.
Microsoft is moving towards a model of requiring TPM so that the boot loader can be secured. It’s another wedge into locking down the PC platform. It’s not a death knell, but you betcha I’m watching closely—not that I’ll have a choice if and when things do close down.
Continued steps toward integration are both inevitable and ease the transition into a closed system.
Here's hoping it's just Chicken Littling. Let's reconvene in 10-15 years and see where we stand.
minor incident:
I totally get that type of skepticism, and that’s fair. I just feel like every six months there’s a new thinkpiece about how X is the death of enthusiast pc hardware.
Sometimes it’s apple, sometimes it’s smartphones, sometimes it’s chrome books, sometimes it’s tablets, sometimes it’s drm, sometimes it’s the downfall of cryptocurrency.
Nothing lasts forever, but I’m just not sure anything that exists right now is going to be the straw that breaks the camel’s back.
I’ve been wrong before, though, so who knows.
Posts
Look into UVing, but these cards just do run kinda hot. I was pegged at 83 the whole time in control on my 3080
You probably saw it in the MWO/MW5/BT thread already, but someone noticed that MW5 menus tends to drive the framerate high which lead to the video cards getting warm/hot. I limited my 3060 Ti to 60FPS and the fans never got loud after that (I don't have any fancy software installed to monitor the temps).
Steam: betsuni7
My 3080 runs loud and hovers around 68C. Seems normal for this gen?
What is the point of being alive if you don't at least try to do something remarkable? ~ Mario Novak
I never fear death or dyin', I only fear never trying.
FWIW a watercooled 3080 rarely goes above the 65 under heavy load.
Yeah, I've been super careful about getting a frame rate limiter for the menu (if one's available...if there's not one available, I'll just use Vsync) enabled ASAP. Probably the highest frame rate I was seeing was in Phasmophobia (250+)...so I Vsync'd that immediately.
Okay, cool. I've been used to seeing upper-50s/mid-60s with my old 2070 Super, so seeing 80s was getting me a little concerned. Didn't help that Googling about it resulted in the usual: "It's fine/broken/needs a little repasting/my cousin's 1050ti never got near 50c while playing CSGO/etc..." blend of answers you'd expect.
Googling strikes again! According to what I could find, removing the heatsink is:
1) Safe and totally won't void the warranty
and 2) Totally will void the warranty
Image linked because it's gigantic, but someone noticed some blue inside his 6700XT when he was installing it and popped off the heat sink only to discover the guy on the production line at PowerColor forgot to take the tape off the thermal strips.
It depends on the company. EVGA is fine as long as you put it back before sending it in for service, I know some others are less forgiving
Samsung warranty on 850 EVO 500GB is 5 years or 150TB
Live and learn: double- or triple-check that something very custom and non-returnable actually matches the specs I need before ordering.
GPU performance is around the GTX 1050/RX 560 level. This is going to be enough GPU for a lot of people, and it's going to kick the stool out from under what remains of the low-end video card market. My #2 nephew turns 10 in just under a year, and I would be super excited to get him a miniPC based on something like that.
Heck, if I put an NVME drive and 32GB of RAM in, it'd last him into his teens, and then still be good enough for his little brother after that.
I wonder if it might be enough to siphon more market share from Intel in the laptop market. Intel still dominates it.
I like the upgrade path you get with an AMD APU. Like, if you need a PC to get you going and you really want something you can play games on but can't afford that yet, with the 5700G you can build a 1080p-capable PC that will get you by until you can afford a video card. Even then, you don't need to do anything other than install the video card. The 5700G isn't quite up to the standards of the 5800X, but it's close enough that with a good video card you're not going to see too much difference. The new 6000-series APUs sound like they'll be even better for that.
The M1's GPU does half the TFLOPS of a 3080, so I want to see benchmarks. I know FLOPS aren't exactly the be-all end-all of performance measuring, but when you go from 19+ to 10, well, I feel like pressing X to doubt.
I'm sure they did what every other company does and found the right benchmark.
The M1 Max in the new MacBook Pro? I've seen closer to 3060 levels (obviously without raytracing). Be interesting to see what the independent benchmarks confirm, but it's got a lot of execution units and a unified memory architecture. On paper it should be easily on par with the current consoles and probably a bit above that. Not mind blowing, but for an SoC GPU, with a pretty nice power profile? It's not nothing.
For a $3000+ laptop it better be not nothing.
That's why I don't like the trend of moving so many components onto the mobo, including the 12V step-down conversion under ATX12VO.
Neither the price nor it being Apple is why I find it cool or exciting. It's a good thing, overall, that something other than x86 + Nvidia/AMD is starting to push performance and efficiency. It's good for the long-term health of the industry and our hobby.
Anything Apple touches or makes is not good for the industry or our hobby.
I say this as someone with an iPhone sitting right next to me.
Apple represents the death of our hobby.
It's their influence on the industries they touch towards unfixable, unupgradable, disposable monolithic computing equipment and mountains of e-waste that results.
Well, he was going on about it over 6 months ago (before we went into our super long lockdown), and the new Macbook pro released what, a few days ago?
So no, not the Pro or Max, just the M1.
This just reads as Chicken Little-ing to me and seems a tiny bit hyperbolic. You're entitled to your opinion, but Apple finally pushing ARM processors into the realm of serious performance is overall a good thing. Apple is a mega corporation; they have their investors in mind at all times, as all large corporations do. That doesn't change the fact that them pushing a processor based on a relatively open and available instruction set architecture to near-desktop performance is a good thing. It's no RISC-V, but it's better than only having one player (x86) in the performance conversation. If Apple can do it, and they clearly can, that means others can as well.
And they've also shown all of these other companies how to make it locked down and the direct opposite of open.
What we're doing here, in this thread, is the result of inertia from business decisions made in the 1980s by fledgling corporations. When x86 gets replaced, it's going to be walled garden city, and all this wonderful stuff we know now is going away.
Showing a reasonable alternative architecture that will allow hardware walled gardens to pop up is the worst thing that could happen.
I guess I’m also not clear what you mean. If you’re just talking about discrete hardware vs SOCs, I mean, that sort of thing has existed forever (and been good enough for 90% of people just as long) and the enthusiast/gaming hardware market is still the area of pc hardware seeing the most growth.
It just sort of feels like the same “death of the enthusiast pc market” thing I’ve heard predicted for 20 years now.
What makes Apple’s successful foray into ARM architecture for desktop hardware a black swan compared to everything else that’s come before?
What made Apple’s successful foray into smartphones, laptops, services, and maintenance different than the numerous companies that came before?
Other than a trimmed-down line of anticompetitive walled-garden ecosystems and mountains of e-waste, nickel-and-diming customers with monthly charges, suppression of right to repair, and abusive and monopolistic practices in their app stores that every other company has worked tirelessly to duplicate.
Without the baggage of old deals hanging on, we’ve seen how tech companies act. Why do we think it’s going to be different and they won’t tend to their worst like they do with everything else?
Again, I’m just not seeing what you’re getting at here. That’s always been Apple’s MO. It hasn’t really affected the enthusiast PC hardware market to any degree. And it’s not like any of that was new ground that Apple trailblazed in the smartphone market either, we just had smaller, shittier ecosystems that were just as walled in. I’m not seeing the doom and gloom on the horizon here, man.
It has been Apple’s MO, and they just showed everyone how to do it with computers.
I don’t have faith that x86’s eventual replacement will be this open. I think you’re going to get completely different architectures, and I think the mindset of locking people into an ecosystem is going to be the norm, because that’s what they’re already doing with everything else.
You can call it doom and gloom all you want. I call it reading the room.
The open PC is an accident of poor business choices that spiraled beyond control. Will it hold? I hope so. Cellphones, consoles, and Apple show it isn’t a law of nature that to be successful you must be open. If VR/AR takes off there is no guarantee those computing/display devices will be open.
Microsoft is moving towards a model of requiring TPM so that the boot loader can be secured. It’s another wedge into locking down the PC platform. It’s not a death knell, but you betcha I’m watching closely—not that I’ll have a choice if and when things do close down.
Continued steps toward integration are both inevitable and ease the transition into a closed system.