How did they just show everyone that? What they’re doing now is no different than what they’ve been doing for the last 10 years or so. The only difference is a change in processor architecture. I still don’t see how that impacts the enthusiast market at all. It hasn’t to date, and Apple making well performing hardware based on their own silicon doesn’t feel like it changes that equation at all, really.
There hasn't been a processor besides X86 that could compare with X86 until the M1 stuff. Itanium withered in obscurity and died because it sucked. ARM as a whole has never taken off as a mass consumer market product because it's used for low power things. RISC is the same. PowerPC never had a chance. X86 has survived since the '80s because nothing else is as good, and the M1 is the first chip outside X86 that actually competes.
Having an alternative architecture come up through the ranks to push X86 would be one thing, but I think what we're going to see is all of these companies (Intel, AMD, and Nvidia/ARM) make a play to replace X86, especially since all of them would have the capability to push CPUs and GPUs for their own systems. Because that's how companies work now. This isn't going to play out like it did in the 1980s, when it was the wild wild west, everyone and their mom could make a 3.1 MHz chip with transistors you could literally see under a regular microscope, and Intel signed production deals with Cyrix and AMD.
And unlike previous attempts to replace X86, all that has to happen now is for a product to be sold with it on board. Like, for instance, a laptop.
Don't get me wrong, I'm aware it's going to be a slow process. It's not going to be in 2-3 years. You couldn't even get the fabrication plants set up to make the chips in that time... but this ability to lock in consumers to a hardware ecosystem feels like the next step. It's all speculation, but nothing these companies have done in the past decade makes me feel warm and fuzzy about it.
Microsoft is moving towards a model of requiring TPM so that the boot loader can be secured. It’s another wedge into locking down the PC platform. It’s not a death knell, but you betcha I’m watching closely—not that I’ll have a choice if and when things do close down.
Nah. It's been over a decade since TPM was introduced. The EFF and everyone made a huge deal out of it, and it stuck around anyway, and literally nothing bad has happened. Even hardcore free software folks like the Debian team assert that it's a tool for security, not a gambit for Microsoft to extinguish the (incredibly tiny) alternative OS user base.
Microsoft has continued to take advantage of hardware security features since then, and free software developers largely have not. The upshot is that desktop Linux is effectively a decade behind macOS and Windows in security. It's really frustrating.
Yeah apparently made out of that same rubberized stuff as their bar mats. They were too cool to pass up, especially since it's GN and they fight the good fight.
jungleroomx ("It's never too many graves, it's always not enough shovels"), Registered User, regular
I don’t like the trend towards integration: it means we can’t as easily mix and match our components, and it’s another step towards PCs just being an SoC in a tower rather than easily upgradeable over time.
Parts of it I am OK with, like not having to buy drop-in NICs or sound cards. But if cryptocurrencies and manufacturing/shipping delays continue the way they have been, integrated video goes from less than dogshit to actually kinda almost OK for casual gaming, and stuff like Game Pass keeps growing in popularity, I think there will definitely be a class of gaming PC sold in the future that is essentially just a console that runs Windows and plugs into a monitor instead of a TV. Maybe Valve was just a few years too early with their "Steam Machines" initiative.
With how hostile Apple continues to be towards "core" gaming I wouldn't look at the hardware in that light at all. They want to increasingly lean on their mobile gaming ecosystem and that means zero incentive for premium products.
-Loki- ("Don't pee in my mouth and tell me it's raining."), Registered User, regular
With how hostile Apple continues to be towards "core" gaming I wouldn't look at the hardware in that light at all. They want to increasingly lean on their mobile gaming ecosystem and that means zero incentive for premium products.
I was curious about gaming on Apple, so I did a quick search for recommended games. The top games to play in 2021 included Portal 2 and Counter Strike Global Offensive.
Now, they're fantastic games, but Portal 2 is 10 years old and Global Offensive is 8 years old.
I'm guessing the available games on Mac situation is pretty dire?
Orca (also known as EspressosaurusWrex), Registered User, regular
With how hostile Apple continues to be towards "core" gaming I wouldn't look at the hardware in that light at all. They want to increasingly lean on their mobile gaming ecosystem and that means zero incentive for premium products.
I was curious about gaming on Apple, so I did a quick search for recommended games. The top games to play in 2021 included Portal 2 and Counter Strike Global Offensive.
Now, they're fantastic games, but Portal 2 is 10 years old and Global Offensive is 8 years old.
I'm guessing the available games on Mac situation is pretty dire?
Their GPU situation has been pretty dire for years now.
jungleroomx ("It's never too many graves, it's always not enough shovels"), Registered User, regular
With how hostile Apple continues to be towards "core" gaming I wouldn't look at the hardware in that light at all. They want to increasingly lean on their mobile gaming ecosystem and that means zero incentive for premium products.
I was curious about gaming on Apple, so I did a quick search for recommended games. The top games to play in 2021 included Portal 2 and Counter Strike Global Offensive.
Now, they're fantastic games, but Portal 2 is 10 years old and Global Offensive is 8 years old.
I'm guessing the available games on Mac situation is pretty dire?
It's non-existent.
You don't buy an Apple to game. That's one of the few truisms we still have left in the industry. Windows is for games, Linux is catching up quickly, but Apple is for app store arcade games pretty much exclusively. As a company they've shown utter indifference to the gaming industry as a whole.
With how hostile Apple continues to be towards "core" gaming I wouldn't look at the hardware in that light at all. They want to increasingly lean on their mobile gaming ecosystem and that means zero incentive for premium products.
I was curious about gaming on Apple, so I did a quick search for recommended games. The top games to play in 2021 included Portal 2 and Counter Strike Global Offensive.
Now, they're fantastic games, but Portal 2 is 10 years old and Global Offensive is 8 years old.
I'm guessing the available games on Mac situation is pretty dire?
It's non-existent.
You don't buy an Apple to game. That's one of the few truisms we still have left in the industry. Windows is for games, Linux is catching up quickly, but Apple is for app store arcade games pretty much exclusively. As a company they've shown utter indifference to the gaming industry as a whole.
Pretty sure Apple are the largest “gaming” company by revenue, aren’t they? If they can do that with just the App Store, why would they invest any more effort into the endeavor?
You can't game on Apple because they don't support Vulkan or fully featured versions of OpenGL. They developed their own proprietary API called Metal and announced that they were deprecating OpenGL when Metal was released.
Further evidence that Apple are actively pursuing their proprietary walled garden. There are only two possible reasons for a player who is nothing like dominant in the gaming space to mandate a new API like this: either it's so phenomenally better than anything else available that it's worth doing for the performance/feature advantage, or you want games developed for macOS not to run on any other platform.
jungleroomx ("It's never too many graves, it's always not enough shovels"), Registered User, regular
With how hostile Apple continues to be towards "core" gaming I wouldn't look at the hardware in that light at all. They want to increasingly lean on their mobile gaming ecosystem and that means zero incentive for premium products.
I was curious about gaming on Apple, so I did a quick search for recommended games. The top games to play in 2021 included Portal 2 and Counter Strike Global Offensive.
Now, they're fantastic games, but Portal 2 is 10 years old and Global Offensive is 8 years old.
I'm guessing the available games on Mac situation is pretty dire?
It's non-existent.
You don't buy an Apple to game. That's one of the few truisms we still have left in the industry. Windows is for games, Linux is catching up quickly, but Apple is for app store arcade games pretty much exclusively. As a company they've shown utter indifference to the gaming industry as a whole.
Pretty sure Apple are the largest “gaming” company by revenue, aren’t they? If they can do that with just the App Store, why would they invest any more effort into the endeavor?
Mobile gaming only and exploiting creators on your App Store seems less like saying they’re the most profitable gaming company and more like a warning of what happens when you follow their business model.
And aside from a few exemplars, mobile gaming is a trash fire of exploitation, getting kids and adults addicted to gambling, and Skinner boxes so complete they make console and PC gaming loot boxes seem like a less horrible business practice.
had another 'random' loss of power yesterday, just as I launched a game.
ran memtest86 and received no errors
some other Google results brought up the "Reliability Monitor" feature, and it's showing several 'hardware error' flags, specifically LiveKernelEvent 141. I can all but prove these correspond with 'normal' freezes/crashes in Ancestors; there's one that might not.
this error code led me down a path of running FurMark to stress test the GPU (Gigabyte-brand GTX 1060 6 GB). It hit 80C in about 3 minutes, then slowly crept up to 88C. FurMark claimed my fan speed was steady at 70%. I stopped the test at minute 15, both to forestall the temps getting into the 90s and because the guide suggested I would have seen issues by that point. I did not see any artifacting (though tbh I don't know what I'm looking for, there). The info section reported no OpenGL errors. Obviously it didn't crash. No new flags appeared in the Reliability Monitor.
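For the curious: instead of watching FurMark's gauge, you can capture a temperature log during a stress run with `nvidia-smi --query-gpu=timestamp,temperature.gpu,fan.speed,utilization.gpu --format=csv,noheader -l 1`. A minimal sketch of parsing those samples in Python; the sample line below is made up, and the helper function is mine, not part of any NVIDIA tooling:

```python
import csv
import io

# One hypothetical line of `nvidia-smi --query-gpu=timestamp,temperature.gpu,fan.speed,utilization.gpu
# --format=csv,noheader` output; field order matches the query string.
SAMPLE = "2021/10/20 21:14:02.123, 88, 70 %, 99 %\n"

def parse_gpu_sample(line):
    """Parse one CSV sample line into (timestamp, temp_c, fan_pct, util_pct)."""
    ts, temp, fan, util = next(csv.reader(io.StringIO(line), skipinitialspace=True))
    strip_pct = lambda s: int(s.rstrip(" %"))  # "70 %" -> 70
    return ts, int(temp), strip_pct(fan), strip_pct(util)

ts, temp_c, fan_pct, util_pct = parse_gpu_sample(SAMPLE)
print(temp_c, fan_pct)  # prints: 88 70
```

Logging a sample per second and eyeballing the trend makes it much easier to tell a gradual thermal creep from a sudden fan failure than watching the live gauge does.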
Further evidence that Apple are actively pursuing their proprietary walled garden. There are only two possible reasons for a player who is nothing like dominant in the gaming space to mandate a new API like this: either it's so phenomenally better than anything else available that it's worth doing for the performance/feature advantage, or you want games developed for macOS not to run on any other platform.
Neither, quite: Metal is the same API used in their iDevices now. And it's going to be exactly the same GPU family, with the M1 already in some iPads.
So rather than trying to be the minority platform that dips into ports from PC games, they are instead trying to pull from their huge mobile library. So you're looking at either free games with horrible whale traps, or smaller-scale cheap games, all oriented towards a majority of systems where people engage in brief spurts and only pay half-attention. They can talk about how their mobile GPUs can outpace an Xbox One now but they sure aren't making it easy for anything you would see on an XB1 to come over.
How did they just show everyone that? What they’re doing now is no different than what they’ve been doing for the last 10 years or so. The only difference is a change in processor architecture. I still don’t see how that impacts the enthusiast market at all. It hasn’t to date, and Apple making well performing hardware based on their own silicon doesn’t feel like it changes that equation at all, really.
There hasn't been a processor besides X86 that could compare with X86 until the M1 stuff. Itanium withered in obscurity and died because it sucked. ARM as a whole has never taken off as a mass consumer market product because it's used for low power things. RISC is the same. PowerPC never had a chance. X86 has survived since the '80s because nothing else is as good, and the M1 is the first chip outside X86 that actually competes.
Having an alternative architecture come up through the ranks to push X86 would be one thing, but I think what we're going to see is all of these companies (Intel, AMD, and Nvidia/ARM) make a play to replace X86, especially since all of them would have the capability to push CPUs and GPUs for their own systems. Because that's how companies work now. This isn't going to play out like it did in the 1980s, when it was the wild wild west, everyone and their mom could make a 3.1 MHz chip with transistors you could literally see under a regular microscope, and Intel signed production deals with Cyrix and AMD.
And unlike previous attempts to replace X86, all that has to happen now is for a product to be sold with it on board. Like, for instance, a laptop.
Don't get me wrong, I'm aware it's going to be a slow process. It's not going to be in 2-3 years. You couldn't even get the fabrication plants set up to make the chips in that time... but this ability to lock in consumers to a hardware ecosystem feels like the next step. It's all speculation, but nothing these companies have done in the past decade makes me feel warm and fuzzy about it.
You know... I keep wondering if "architecture" is really the right word to get thrown around with X86. It's an instruction set. It's been implemented in radically different architectures over the years. There used to be something like a half dozen X86 vendors. AMD, Intel, Via, Cyrix, something called a "Winchip" and a "Rise Technology mP6". And this ecosystem of alternate x86 vendors only existed because of bizarre contractual and legal outcomes that arose from IBM losing control of the platform.
Now it's whittled down to Intel and AMD who, if I'm not mistaken, through assorted legal settlements have quasi-perpetual licenses to each other's assorted x86 extension IP?
But then you also have Windows as the relatively open OS, with roots in a pre-internet age where there was, relative to today at least, little risk in letting you run whatever software you wanted. If you got a virus, your computer wasn't networked, and you were just a full format away from getting back where you started after maybe a day's work.
It was always the openness and competitiveness of the x86 platform and its operating systems that made it thrive and outshine other, theoretically technically superior, platforms. Even if, and it's a big if, you were able to compare apples-to-apples applications running on the M1 right now, and the M1 performed better, it is inconceivable to me that Apple would keep up the effort. It's another monopolized walled garden, and they'd actively compete on the technicals only long enough to lock in a market share with proprietary bullshit, and then they'd rest on their laurels. And if no one else is making M1 compatible hardware, you can pretty much go fuck yourself if you don't like it.
I'm not sure what the state of the art of M1 hacking is. I know it's ARM based, I don't know how well other operating systems that are designed to run on ARM can run on it. I don't know if "ARM based" are weasel words, and how much proprietary, ARM-incompatible crap it has. In general, for reasons I've never really investigated, you don't see a lot of broad cross compatibility among ARM software ecosystems. But maybe that's my own ignorance. I got curious and apparently you can install Android on a Raspberry Pi? It's just really awkward? Like a Hackintosh?
I donno. I have a really dismal outlook on the future of computing, and every which direction I look, I see the platform being turned into a walled garden. I have all kinds of conspiracy theories about how it could happen. And theories about how TSMC will be taken off the field. And theories about how those two forces will interweave to really fuck everyone.
It's concerned me enough to actually put my money where my mouth is and refuse to invest in TSMC, or any company reliant on TSMC. Been buying
But who knows. Maybe a relatively open ARM/Linux based future will preserve open computing, even if its utility dwindles.
Orca (also known as EspressosaurusWrex), Registered User, regular
edited October 2021
ARM is just a family of instruction sets and associated core designs. Or you can license an instruction set and design your own like Apple's done.
There's still literally every other peripheral you need to deal with and Apple likes putting secure enclaves in everything (I mean, they do have a user benefit, but it also helps them keep their walled garden). So compatibility isn't as simple as just the instruction set. It's the entire platform you need to contend with, and that's a big, complicated beast.
the distinction between architecture/instruction set for x86 is a real thing, but kind of irrelevant for the point.
yes, an Intel 11900K is not the same architecture as a 450 MHz Pentium II. But they both run the x86 instruction set.
When we talk about the inefficiency of the x86 architecture, we're really talking about it as an instruction set. So yes, they're different things, but for all practical purposes in how we talk about it, we talk about x86 as an architecture. It is kind of a distinction without a difference.
As for Macs and their walled garden: I'm typing this on a 13" M1 MacBook Pro, which is my favorite laptop I've ever used in terms of hardware; I just don't like macOS very much. But this is a work machine, so it's what I use for work.
Apple has always been a walled garden. But they've made allowances when they needed to. When they switched to Intel, OS X was still a walled garden, but they had to allow Windows via Boot Camp because there was an app gap and some people needed Windows apps to do what they did. Allowing people to install Windows on their Intel Macs just got people buying more Macs. Then people would use OS X more, and the goal was to slowly get people using OS X instead of Windows over time, so when they needed to replace their Mac, they were buying another Mac. And then maybe they didn't even need Windows by then, and were living in the Apple ecosystem entirely.
Then with the rise of the iPhone in North America, they were able to make that walled garden tighter. And they continue to do so to this day.
And Apple is not alone in this. Google is trying to do it with its own products as well, albeit much less successfully, but I guarantee you that if Google could figure out a way to take Android away from everything but the Pixel, they would.
Every company wants you to buy their products, and only their products. Apple just happens to be the most successful company at it.
the distinction between architecture/instruction set for x86 is a real thing, but kind of irrelevant for the point.
yes, an Intel 11900K is not the same architecture as a 450 MHz Pentium II. But they both run the x86 instruction set.
When we talk about the inefficiency of the x86 architecture, we're really talking about it as an instruction set. So yes, they're different things, but for all practical purposes in how we talk about it, we talk about x86 as an architecture. It is kind of a distinction without a difference.
As for Macs and their walled garden: I'm typing this on a 13" M1 MacBook Pro, which is my favorite laptop I've ever used in terms of hardware; I just don't like macOS very much. But this is a work machine, so it's what I use for work.
Apple has always been a walled garden. But they've made allowances when they needed to. When they switched to Intel, OS X was still a walled garden, but they had to allow Windows via Boot Camp because there was an app gap and some people needed Windows apps to do what they did. Allowing people to install Windows on their Intel Macs just got people buying more Macs. Then people would use OS X more, and the goal was to slowly get people using OS X instead of Windows over time, so when they needed to replace their Mac, they were buying another Mac. And then maybe they didn't even need Windows by then, and were living in the Apple ecosystem entirely.
Then with the rise of the iPhone in North America, they were able to make that walled garden tighter. And they continue to do so to this day.
And Apple is not alone in this. Google is trying to do it with its own products as well, albeit much less successfully, but I guarantee you that if Google could figure out a way to take Android away from everything but the Pixel, they would.
Every company wants you to buy their products, and only their products. Apple just happens to be the most successful company at it.
Inefficiency of the instruction set is an issue, but not one that is relevant anymore. It was relevant when the decoder was enormous (because gigantic instructions) and took up a bunch of the silicon. These days? Eh, who gives a shit, it's all uops under the hood anyway. The decoder is a tiny part of the silicon. Still a pain in the ass for the poor bastards that have to write/read the assembly, but that doesn't matter to someone using the program. Or for most programmers for that matter.
So talking about efficiency when it comes to instruction sets has little point. There is a point to talking about efficiency between architectures, however, and that's where we can start looking at how much power it takes to perform the same computation across parts; and for a surprisingly wide variety of cases, Apple has made something special when it comes to efficient-yet-speedy laptop processors. Intel, on the other hand, is still hamstrung by an obsolete design and silicon fab tech, so it takes them a bunch more power to do the same computation. AMD is looking up, but their emphasis so far has been on high power, high performance chips where Apple isn't really competing yet.
Google will likely never take Android away from everything but the Pixel. It was their strategic choice to make sure platform owners like Microsoft and Apple couldn't lock them out of search/ads, and now they're making a bundle off the app store. It helps them that they are the only choice for other companies if they want to compete in the cellphone space. They will only consider walling that off if something huge changes in their strategy.
I donno, once upon a time, at a very surface level, it was explained to me that x86's instruction set would always result in a less efficient architecture than the ARM instruction set. It's about how deep of a pipeline the sophistication of certain x86 instructions simply require, versus ARM which keeps things simple and shallow. Sure, it may take multiple ARM instructions to accomplish what a single x86 instruction can accomplish. But ARM's entire pipeline benefits from not having such monolithic instructions in the first place.
I have no idea if this is still the state of the art. I have no idea if ARM has suffered the same feature creep that x86 suffered. I have no idea if somehow the pipeline depth issues of x86 compared to ARM really just all got abstracted away somehow. I don't know that I've ever seen an honest to god comparison of a truly demanding, high performance application, cross compiled for ARM and x86, running on as equivalent hardware as you could get when you are comparing architectures.
Instead, x86 and ARM just kind of went their different ways. The advantages of ARM in the mobile, low powered SOC space were assumed. And the advantages of x86 in the high performance desktop space were assumed. And M1 is really only challenging in the weird, almost not applicable to anything else middle space of boutique $2000-3000 laptops.
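To make the "one fat instruction vs. several thin ones" point concrete, here's a toy sketch in Python. The two mini-ISAs below are entirely invented for illustration (nothing to do with real x86 or ARM encodings): a single CISC-style read-modify-write memory instruction next to the load/add/store triple a RISC-style core would use for the same effect.

```python
def run(program, mem, regs):
    """Execute a list of (op, *args) tuples against mem/regs in place."""
    for op, *args in program:
        if op == "LOAD":        # LOAD rd, addr  : rd <- mem[addr]
            rd, addr = args
            regs[rd] = mem[addr]
        elif op == "STORE":     # STORE rs, addr : mem[addr] <- rs
            rs, addr = args
            mem[addr] = regs[rs]
        elif op == "ADD":       # ADD rd, rs     : rd <- rd + rs (registers only)
            rd, rs = args
            regs[rd] += regs[rs]
        elif op == "ADDM":      # CISC-style: mem[addr] <- mem[addr] + rs, one instruction
            addr, rs = args
            mem[addr] += regs[rs]

# CISC flavor: one instruction that reads memory, adds, and writes memory back.
mem_cisc = {0x10: 5}
run([("ADDM", 0x10, "r1")], mem_cisc, {"r1": 7})

# RISC flavor: same result, three simple instructions; memory touched only by load/store.
mem_risc = {0x10: 5}
run([("LOAD", "r0", 0x10), ("ADD", "r0", "r1"), ("STORE", "r0", 0x10)],
    mem_risc, {"r0": 0, "r1": 7})

assert mem_cisc[0x10] == mem_risc[0x10] == 12
```

The point of the post above, in these terms: the `ADDM`-style instruction is denser, but the hardware still has to perform a load, an add, and a store under the hood, which is roughly what "decoding into uops" means.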
jungleroomx ("It's never too many graves, it's always not enough shovels"), Registered User, regular
Tbf the only reason M1 is only really in that space is because of who made it.
Orca (also known as EspressosaurusWrex), Registered User, regular
I donno, once upon a time, at a very surface level, it was explained to me that x86's instruction set would always result in a less efficient architecture than the ARM instruction set. It's about how deep of a pipeline the sophistication of certain x86 instructions simply require, versus ARM which keeps things simple and shallow. Sure, it may take multiple ARM instructions to accomplish what a single x86 instruction can accomplish. But ARM's entire pipeline benefits from not having such monolithic instructions in the first place.
I have no idea if this is still the state of the art. I have no idea if ARM has suffered the same feature creep that x86 suffered. I have no idea if somehow the pipeline depth issues of x86 compared to ARM really just all got abstracted away somehow. I don't know that I've ever seen an honest to god comparison of a truly demanding, high performance application, cross compiled for ARM and x86, running on as equivalent hardware as you could get when you are comparing architectures.
Instead, x86 and ARM just kind of went their different ways. The advantages of ARM in the mobile, low powered SOC space were assumed. And the advantages of x86 in the high performance desktop space were assumed. And M1 is really only challenging in the weird, almost not applicable to anything else middle space of boutique $2000-3000 laptops.
Except ARM doesn't keep things simple and shallow. It did at the dawn of time, but their modern instruction sets and cores are every bit as CISC as x86-64 (albeit much more readable). It's the underlying architecture and the silicon process it's built on that matters more than the instruction set at this point, assuming the instruction sets accelerate the same thing (not a guarantee). In a sense, RISC won since everyone uses uops now internally, it's just the instruction set that's exposed to the outside world that gets decoded into uops. And this has been true since the first Pentiums, so 25 years at this point.
ARM has their own SIMD instructions the way x86-64 has SSE and AVX, ARM has instructions that take multiple cycles to complete the way x86-64 has many many instructions that take multiple cycles to complete, etc. etc.
The failures in the past have been because people were trying to run consumer OSes on what amounted to shitty mobile ARM chips. Performance sucked, the application support wasn't there, and they generally flopped. Apple has done multiple architecture changes, so they know how to build that support into the OS so the user doesn't need to worry about it, and if they're going to release it, that means emulation has good enough performance people won't scream.
Apple's approach to their ecosystem is why the M1 is succeeding where attempts to make Windows on ARM a thing by Microsoft have failed miserably.
They attacked it from the OS platform side and from the silicon side, and while I hate what it presages, I have to give them credit for knowing how to handle this kind of transition masterfully.
Slap an Exynos processor in a laptop and boot Windows ARM64 on it, and you'll still have a pile of shit no one wants.
I donno, once upon a time, at a very surface level, it was explained to me that x86's instruction set would always result in a less efficient architecture than the ARM instruction set. It's about how deep of a pipeline the sophistication of certain x86 instructions simply require, versus ARM which keeps things simple and shallow. Sure, it may take multiple ARM instructions to accomplish what a single x86 instruction can
ARM has suffered feature creep as well; points at FJCVTZS (Floating-point Javascript Convert to Signed fixed-point, rounding toward Zero).
The strength to me of ARM is it allows you to go CISC / specialized for the domains you care about instead of all the domains x86_64 cares about.
I don't know that I've ever seen an honest to god comparison of a truly demanding, high performance application, cross compiled for ARM and x86, running on as equivalent hardware as you could get when you are comparing architectures.
This is fairly common in HPC now that we have high performance ARM chips.
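FJCVTZS is a fun concrete case of domain-specific feature creep: it bakes JavaScript's ToInt32 conversion (truncate toward zero, then wrap into the signed 32-bit range, with NaN and infinities mapping to zero) into a single instruction. Those semantics, sketched in plain Python as a reference model rather than anything resembling the hardware:

```python
import math

def js_to_int32(x):
    """Model of JavaScript's ToInt32 conversion, the semantics FJCVTZS
    implements in hardware: truncate toward zero, wrap modulo 2**32 into
    the signed 32-bit range; NaN and infinities become 0."""
    if x != x or x in (float("inf"), float("-inf")):
        return 0
    n = int(math.trunc(x)) % 2**32          # truncate toward zero, then wrap
    return n - 2**32 if n >= 2**31 else n   # reinterpret as signed

print(js_to_int32(3.99))    # 3
print(js_to_int32(-3.99))   # -3
print(js_to_int32(2**31))   # -2147483648 (wraps around)
```

Doing all of that in one instruction instead of a handful is exactly the kind of thing you add when a huge fraction of the code your chips run is JavaScript.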
The Apple ARM chips are also the first time we're seeing, in a consumer product at least, what an ARM processor can do when you actually give it some power and thermal management to go with it. I'm not the target market for it but I'm super, super, super interested to see what the TDP of the M1 Max really is, and what that means for performance.
The 10W M1 in the MacBook Air and 15W M1 in the MacBook Pro 13 can already do some great things. Throw more wattage and cooling at those designs and I'll be interested to see what they can actually do.
ARM has their own SIMD instructions the way x86-64 has SSE and AVX, ARM has instructions that take multiple cycles to complete the way x86-64 has many many instructions that take multiple cycles to complete, etc. etc.
ARM has multiple SIMD instruction sets with NEON, SVE, and SVE2, and just like x86-64, which ones you have depends on whether you are designing a cellphone, a laptop, or a supercomputer.
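For anyone who hasn't met SIMD before: the whole idea is one instruction applied across several lanes at once. A toy sketch of the lane semantics in plain Python (real NEON/SSE operate on fixed-width hardware registers; the lane width of 4 here is just an assumption for illustration):

```python
LANES = 4  # pretend our vector registers hold 4 lanes, like 128-bit SSE/NEON holding 32-bit ints

def vadd(a, b):
    """One 'vector add': all lanes of two vector registers added in a single operation."""
    assert len(a) == len(b) == LANES
    return [x + y for x, y in zip(a, b)]

def add_arrays(xs, ys):
    """Add two equal-length arrays one vector at a time (length assumed a multiple of LANES)."""
    out = []
    for i in range(0, len(xs), LANES):
        out.extend(vadd(xs[i:i + LANES], ys[i:i + LANES]))  # 1 vector op replaces 4 scalar adds
    return out

print(add_arrays([1, 2, 3, 4, 5, 6, 7, 8], [10, 10, 10, 10, 20, 20, 20, 20]))
# prints: [11, 12, 13, 14, 25, 26, 27, 28]
```

The design trade-off the posts above describe is essentially how wide to make `LANES` and how many of these units to build, against the silicon area and power they cost.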
Orca (also known as EspressosaurusWrex), Registered User, regular
ARM has their own SIMD instructions the way x86-64 has SSE and AVX, ARM has instructions that take multiple cycles to complete the way x86-64 has many many instructions that take multiple cycles to complete, etc. etc.
ARM has multiple SIMD instructions sets with NEON, SVE, and SVE2 and just like x86-64 which ones you have depends if you are designing a cellphone, laptop, or a supercomputer.
Yeah. It’s a question of which ISA you want and which optional vector and floating point accelerators you want, and how much silicon area and power you’re willing to spend to get it.
Apple’s architecture for the M1 is interesting: it’s like something halfway between an Xbox 360 and a regular consumer PC in terms of its memory layout. For memory bandwidth constrained CPU workloads I bet it crushes on a per-watt basis.
Huh. Step-up (to 3080 FTW3) just popped on an EVGA 3070 I bought last year. I kinda figured by this point, step-ups were dead.
Honestly, if you're completely devoid of morals and willing to engage in capitalist skullduggery (as though there's any other kind), you might be better off just buying whatever new card is available and selling the old one at the opposite of a loss.
I got my EVGA RTX 3080 FTW3 Ultra with my new PC out of MicroCenter (a small assembly fee was a tiny price to pay for bypassing the waiting line for 30-series cards; not a fan of the motherboard they recommended, but admittedly, almost all motherboard software is terrible and the hardware is potent). That was barely six months ago; EVGA gave me the "opportunity" to buy an EVGA RTX 3080Ti FTW3 Ultra last week at their current MSRP, and I took it.
Fast forward a week, and the eBay auction I set up for my 3080 FTW3 (with its original box, mounting bracket accessories, no crypto mining abuse, etc.) has exceeded what I just paid for the 3080 Ti FTW3. By a few hundred dollars. And the bidding hasn't technically ended yet.
I'm not proud of it (and given this is eBay, I could be screwed by a particularly nefarious buyer, though slavish obedience to eBay policy is a good way to avoid getting screwed by trollish refunding), but I did it anyway. I was just hoping to break even (as greedy as that sounds), never mind make a profit. It's not something I'd recommend, but it's definitely an option.
EDIT: The fact that if I'd kept my GTX 970 and sold it today, I'd make very nearly what I did when I sold it on eBay upon getting my GTX 1080 Ti (not accounting for inflation, admittedly), really highlights how insane the situation is.
ARM has their own SIMD instructions the way x86-64 has SSE and AVX, ARM has instructions that take multiple cycles to complete the way x86-64 has many many instructions that take multiple cycles to complete, etc. etc.
ARM has multiple SIMD instruction sets with NEON, SVE, and SVE2, and just like x86-64, which ones you have depends on whether you are designing a cellphone, a laptop, or a supercomputer.
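To make the "multiple SIMD instruction sets" point concrete, here's a minimal C sketch of a 4-wide float add using NEON intrinsics, with a scalar fallback for non-ARM builds (the `add4` helper name is mine, not from any library):

```c
#include <stddef.h>

#if defined(__ARM_NEON)
#include <arm_neon.h>
/* NEON path: load four floats, add them in one SIMD instruction, store. */
static void add4(const float *a, const float *b, float *out) {
    vst1q_f32(out, vaddq_f32(vld1q_f32(a), vld1q_f32(b)));
}
#else
/* Scalar fallback for non-NEON targets (on x86-64, SSE/AVX intrinsics
   would play the same role NEON plays here). */
static void add4(const float *a, const float *b, float *out) {
    for (int i = 0; i < 4; i++)
        out[i] = a[i] + b[i];
}
#endif
```

SVE and SVE2 generalize this to hardware-chosen vector lengths, which is part of why the SIMD extension you get depends on the target market.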
I fully agree, with the M1 Pro and M1 Max memory systems being closer to what GPUs do (and Xeon Phi did).
It seems like the problems that chip designers are solving for aren't integer or float instruction bound, so people are getting more high-bandwidth memory closer to the compute. What is super cool is all the different approaches: Intel's HBM2, AMD's stacked 3D V-Cache, and Apple's LPDDR5.
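A quick back-of-envelope calculation shows why these memory choices matter so much. A hedged C sketch—the spec figures in the comments are commonly cited numbers, not official ones:

```c
/* Theoretical peak bandwidth = transfer rate x bus width in bytes.
 * This ignores real-world efficiency, so treat it as an upper bound. */
static double peak_gb_per_s(double mega_transfers_per_s, int bus_width_bits) {
    return mega_transfers_per_s * 1e6 * (bus_width_bits / 8.0) / 1e9;
}

/* M1 Max (as commonly reported): LPDDR5-6400 on a 512-bit bus
 *   peak_gb_per_s(6400, 512) -> ~409.6 GB/s
 * Typical desktop: DDR4-3200, dual channel (2 x 64-bit)
 *   peak_gb_per_s(3200, 128) -> ~51.2 GB/s
 * Roughly 8x the peak bandwidth sitting right next to the CPU cores. */
```

That gap, not raw instruction throughput, is what the HBM2 / V-Cache / on-package LPDDR5 approaches are all chasing.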
Nah. It's been over a decade since TPM was introduced. The EFF and everyone made a huge deal out of it, and it stuck around anyway, and literally nothing bad has happened. Even hardcore free software folks like the Debian team assert that it's a tool for security, not a gambit for Microsoft to extinguish the (incredibly tiny) alternative OS user base.
Microsoft has continued to take advantage of hardware security features since then, and free software developers largely have not. The upshot is that desktop Linux is effectively a decade behind macOS and Windows in security. It's really frustrating.
Yeah apparently made out of that same rubberized stuff as their bar mats. They were too cool to pass up, especially since it's GN and they fight the good fight.
Unfortunately no.
One's an exposed GPU core with mounting brackets and PCB hardware, one's a CPU socket with VRMs, and one's just the GN logo.
but
joke's on me, they're sold out
0/10!
Steam ID: Good Life
To be fair (to be faaaaaaiiiiiiirrrrr) it used to be software rendering. Now it's hardware rendering but the hardware has moved into the CPU
but I mean, you're still right
sigh
https://www.amd.com/en/support/chipsets/amd-socket-am4/x570
In keeping with all hardware these days.
Need to add in a queue system, lottery, or random date/time of stock refreshes to make it feel more real.
Steam: betsuni7
Parts of it I am OK with, like not having to buy drop-in NICs or sound cards. But if cryptocurrencies and manufacturing/shipping delays continue the way they have been, integrated video goes from less-than-dogshit to actually almost OK for casual gaming, and stuff like Game Pass keeps growing in popularity, I think there will definitely be a class of gaming PC sold in the future that is essentially just a console that runs Windows and plugs into a monitor instead of a TV. Maybe Valve was just a few years too early with their "Steam Machines" initiative.
I was curious about gaming on Apple, so I did a quick search for recommended games. The top games to play in 2021 included Portal 2 and Counter Strike Global Offensive.
Now, they're fantastic games, but Portal 2 is 10 years old and Global Offensive is 9 years old.
I'm guessing the available games on Mac situation is pretty dire?
Their GPU situation has been pretty dire for years now.
It's non-existent.
You don't buy an Apple to game. That's one of the few truisms we still have left in the industry. Windows is for games, Linux is catching up quickly, but Apple is for app store arcade games pretty much exclusively. As a company they've shown utter indifference to the gaming industry as a whole.
Pretty sure Apple are the largest “gaming” company by revenue, aren’t they? If they can do that with just the App Store, why would they invest any more effort into the endeavor?
https://appleinsider.com/articles/21/10/03/apple-earned-more-from-gaming-than-sony-nintendo-microsoft-activision-combined/amp/
Mobile gaming only, plus exploiting creators on your App Store, seems less like a claim to being the most profitable gaming company and more like a warning about what happens when you follow their business model.
And aside from a few exemplars, mobile gaming is a trash fire of exploitation, getting kids and adults addicted to gambling, and skinner boxes so complete they make console and PC gaming loot boxes seem like a less horrible business practice.
ran memtest86 and received no errors
some other google results brought up the "reliability monitor" feature, and it's showing several 'hardware error' flags, specifically livekernelevent 141. I can all-but prove these correspond with 'normal' freezes/crashes in Ancestors; there's one that might not.
this error code led me down a path of running FurMark to stress test the GPU (Gigabyte-brand GTX 1060 6 GB). It hit 80C in about 3 minutes, then slowly crept up to 88C. FurMark claimed my fan speed was steady at 70%. I stopped the test at the 15-minute mark, both to forestall the temps getting into the 90s and because the guide suggested I would have seen issues by that point. I did not see any artifacting (though tbh I don't know what I'm looking for there). The info section reported no OpenGL errors. Obviously it didn't crash. No new flags appeared in the reliability monitor.
kind of running out of ideas on what to poke.
Neither, quite. Metal is the same API used in their iDevices now, and it's going to be exactly the same GPU family, with the M1 already in some iPads.
So rather than trying to be the minority platform that dips into ports from PC games, they are instead trying to pull from their huge mobile library. So you're looking at either free games with horrible whale traps, or smaller-scale cheap games, all oriented towards a majority of systems where people engage in brief spurts and only pay half-attention. They can talk about how their mobile GPUs can outpace an Xbox One now but they sure aren't making it easy for anything you would see on an XB1 to come over.
"Walled garden" is way too nice of a euphemism for the crap that Apple has built.
You know... I keep wondering if "architecture" is really the right word to get thrown around with x86. It's an instruction set. It's been implemented in radically different architectures over the years. There used to be something like a half dozen x86 vendors: AMD, Intel, VIA, Cyrix, something called a "WinChip," and a "Rise Technology mP6." And this ecosystem of alternate x86 vendors only existed because of bizarre contractual and legal outcomes that arose from IBM losing control of the platform.
Now it's whittled down to Intel and AMD who, if I'm not mistaken, through assorted legal settlements have quasi-perpetual licenses to each other's assorted x86 extension IP?
But then you also have Windows as the relatively open OS, with roots in a pre-internet age where there was, relative to today at least, little risk in letting you run whatever software you wanted. If you got a virus, your computer wasn't networked, and you were just a full format away from getting back where you started after maybe a day's work.
It was always the openness and competitiveness of the x86 platform and its operating systems that made it thrive and outshine other, theoretically technically superior, platforms. Even if, and it's a big if, you were able to compare apples-to-apples applications running on the M1 right now, and the M1 performed better, it is inconceivable to me that Apple would keep up the effort. It's another monopolized walled garden, and they'd actively compete on the technicals only long enough to lock in a market share with proprietary bullshit, and then they'd rest on their laurels. And if no one else is making M1-compatible hardware, you can pretty much go fuck yourself if you don't like it.
I'm not sure what the state of the art of M1 hacking is. I know it's ARM based, I don't know how well other operating systems that are designed to run on ARM can run on it. I don't know if "ARM based" are weasel words, and how much proprietary, ARM-incompatible crap it has. In general, for reasons I've never really investigated, you don't see a lot of broad cross compatibility among ARM software ecosystems. But maybe that's my own ignorance. I got curious and apparently you can install Android on a Raspberry Pi? It's just really awkward? Like a Hackintosh?
I donno. I have a really dismal outlook on the future of computing, and every which direction I look, I see the platform being turned into a walled garden. I have all kinds of conspiracy theories about how it could happen. And theories about how TSMC will be taken off the field. And theories about how those two forces will interweave to really fuck everyone.
It's concerned me enough to actually put my money where my mouth is and refuse to invest in TSMC, or any company reliant on TSMC. Been buying
But who knows. Maybe a relatively open ARM/Linux based future will preserve open computing, even if its utility dwindles.
There's still literally every other peripheral you need to deal with and Apple likes putting secure enclaves in everything (I mean, they do have a user benefit, but it also helps them keep their walled garden). So compatibility isn't as simple as just the instruction set. It's the entire platform you need to contend with, and that's a big, complicated beast.
Yes, an Intel 11900K is not the same architecture as a Pentium II at 450 MHz. But they both run the x86 instruction set.
When we talk about the inefficiency of the x86 architecture, we're really talking about it as an instruction set. So yes, they're different things, but for all practical purposes in how we talk about it, we talk about x86 as an architecture. It is kind of a distinction without a difference.
As for Macs and their walled garden. I'm typing this on a 13" M1 MacBook pro which is my favorite laptop I've ever used in terms of hardware, I just don't like MacOS very much. But this is a work machine so it is what I use for work.
Apple has always been a walled garden. But they've made allowances for when they needed to. When they switched to Intel, OS X was still a walled garden, but they had to allow Windows in boot camp because there was an app gap and some people needed Windows apps to do what they did. Allowing people to install Windows on their Intel Macs just got people buying more Macs. Then people would use OS X more, and the goal was to slowly get people using OS X instead of Windows over time, so when they needed to replace their Mac, they were buying another mac. And then maybe they didn't even need Windows by then, and were living in the Apple ecosystem entirely.
Then with the rise of the iPhone in North America, they were able to make that walled garden tighter. And they continue to do so to this day.
And Apple is not alone in this. Google is trying to do it with its own products as well, albeit much less successfully, but I guarantee you that if Google could figure out a way to take Android away from everything but the Pixel, they would.
Every company wants you to buy their products, and only their products. Apple just happens to be the most successful company at it.
Inefficiency of the instruction set is an issue, but not one that is relevant anymore. It was relevant when the decoder was enormous (because gigantic instructions) and took up a bunch of the silicon. These days? Eh, who gives a shit, it's all uops under the hood anyway. The decoder is a tiny part of the silicon. Still a pain in the ass for the poor bastards that have to write/read the assembly, but that doesn't matter to someone using the program. Or for most programmers for that matter.
So talking about efficiency when it comes to instruction sets has little point. There is a point to talking about efficiency between architectures, however, and that's where we can start looking at the high power utilization to perform the same computation across parts--and for a surprisingly wide variety of cases, Apple has made something special when it comes to efficient-yet-speedy laptop processors. Intel on the other hand is still hamstrung by an obsolete design and silicon fab tech, so it takes them a bunch more power to do the same computation. AMD is looking up, but their emphasis so far has been on high power, high performance chips where Apple isn't really competing yet.
Google will likely never take Android away from everything but the Pixel. It was their strategic choice to make sure platform owners like Microsoft and Apple couldn't lock them out of search/ads, and now they're making a bundle off the app store. It helps them that they are the only choice for other companies if they want to compete in the cellphone space. They will only consider walling that off if something huge changes in their strategy.
I have no idea if this is still the state of the art. I have no idea if ARM has suffered the same feature creep that x86 suffered. I have no idea if somehow the pipeline depth issues of x86 comparative to ARM really just all got abstracted away somehow. I don't know that I've ever seen an honest to god comparison of a truly demanding, high performance application, cross compiled for ARM and x86, running on as equivalent hardware as you could get when you are comparing architectures.
Instead, x86 and ARM just kind of went their different ways. The advantages of ARM in the mobile, low powered SOC space were assumed. And the advantages of x86 in the high performance desktop space were assumed. And M1 is really only challenging in the weird, almost not applicable to anything else middle space of boutique $2000-3000 laptops.
Except ARM doesn't keep things simple and shallow. It did at the dawn of time, but their modern instruction sets and cores are every bit as CISC as x86-64 (albeit much more readable). It's the underlying architecture and the silicon process it's built on that matter more than the instruction set at this point, assuming the instruction sets accelerate the same thing (not a guarantee). In a sense, RISC won since everyone uses uops now internally; it's just the instruction set that's exposed to the outside world that gets decoded into uops. And this has been true since the first Pentiums, so 25 years at this point.
The failures in the past have been because people were trying to run consumer OSes on what amounted to shitty mobile ARM chips. Performance sucked, the application support wasn't there, and they generally flopped. Apple has done multiple architecture changes, so they know how to build that support into the OS so the user doesn't need to worry about it, and if they're going to release it, that means emulation has good enough performance people won't scream.
Apple's approach to their ecosystem is why the M1 is succeeding where attempts to make Windows on ARM a thing by Microsoft have failed miserably.
They attacked it from the OS platform side and from the silicon side, and while I hate what it presages, I have to give them credit for knowing how to handle this kind of transition masterfully.
Slap an Exynos processor in a laptop and boot Windows ARM64 on it, and you'll still have a pile of shit no one wants.
ARM has suffered feature creep as well; points at FJCVTZS (Floating-point JavaScript Convert to Signed fixed-point, rounding toward Zero).
The strength to me of ARM is it allows you to go CISC / specialized for the domains you care about instead of all the domains x86_64 cares about.
This is fairly common in HPC now that we have high performance ARM chips.
The 10W M1 in the MacBook Air and 15W M1 in the MacBook Pro 13 can already do some great things. Throw more wattage and cooling at those designs and I'll be interested to see what they can actually do.